AUTOMATED ARM ASSEMBLY FOR USE DURING A MEDICAL PROCEDURE ON AN ANATOMICAL PART
PATENT ABSTRACT:
Intelligent positioning system and methods therefor. The present invention relates to systems and methods for adaptively and interoperably configuring an automated arm used during a medical procedure. The automated arm is configured to position and orient an end effector on the automated arm at a desired distance and orientation relative to a target. The end effector can be an external video scope and the target can be a surgical port. End effector and target positions and orientations can be continuously updated. The arm position can be moved to new locations in response to user commands. The automated arm may include a multi-joint arm attached to a weighted frame. The weighted frame may include a tower and a supporting beam. Publication No.: BR112015023547B1 Application No.: R112015023547-6 Filing date: 2014-03-14 Publication date: 2022-01-18 Inventors: WOOD Michael; PIRON Cameron; YUWARAJ Murugathas; SELA Gal; RICHMOND Joshua; Mcfadyen Stephen; Panther Alex; Shanmugaratnam Nishanthan; Lau William; M Thomas Monroe; Hodges Wes; Alexander Simon; Gallop David Applicant: Synaptive Medical (Barbados) Inc. IPC primary class:
PATENT DESCRIPTION:
[0001] This application claims priority to U.S. Provisional Patent Application No. 61/801,530, entitled “SYSTEMS, DEVICES AND METHODS FOR PLANNING, IMAGING, AND GUIDANCE OF MINIMALLY INVASIVE SURGICAL PROCEDURES” and filed on March 15, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/818,280, entitled “SYSTEMS, DEVICES AND METHODS FOR PLANNING, IMAGING, AND GUIDANCE OF MINIMALLY INVASIVE SURGICAL PROCEDURES” and filed on May 1, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/800,695, entitled “INSERTABLE MAGNETIC RESONANCE IMAGING COIL PROBE FOR MINIMALLY INVASIVE CORRIDOR-BASED PROCEDURES” and filed on March 15, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/818,223, entitled “IMAGING ASSEMBLY FOR ACCESS PORT-BASED MEDICAL PROCEDURES” and filed on May 1, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/801,143, entitled “INSERTABLE MAGNETIC RESONANCE IMAGING COIL PROBE FOR MINIMALLY INVASIVE CORRIDOR-BASED PROCEDURES” and filed on March 15, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/818,325, entitled “INSERTABLE MAGNETIC RESONANCE IMAGING COIL PROBE FOR MINIMALLY INVASIVE CORRIDOR-BASED PROCEDURES” and filed on May 1, 2013, the entire contents of which are incorporated herein by reference. This application claims priority to U.S. Provisional Patent Application No. 61/801,746, entitled “INSERT IMAGING DEVICE” and filed on March 15, 2013, the entire contents of which are incorporated herein by reference. This application claims priority to U.S. Provisional Patent Application No. 61/818,255, entitled “INSERT IMAGING DEVICE” and filed on May 1, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/800,155, entitled “PLANNING, NAVIGATION AND SIMULATION SYSTEMS AND METHODS FOR MINIMALLY INVASIVE THERAPY” and filed on March 15, 2013, the entire contents of which are incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 61/924,993, entitled “PLANNING, NAVIGATION AND SIMULATION SYSTEMS AND METHODS FOR MINIMALLY INVASIVE THERAPY” and filed on January 8, 2014, the entire contents of which are incorporated herein by reference. FIELD OF THE INVENTION [0002] The present disclosure relates to mechanically aided positioning of medical devices during medical procedures. BACKGROUND OF THE ART [0003] Intracranial surgical procedures present novel treatment opportunities with the potential for significant improvements in patient outcomes. In the case of port-based surgical procedures, many existing optical imaging devices and modalities are incompatible for a number of reasons, including, for example, poor imaging sensor field of view, magnification and resolution, poor alignment of the imaging device with the access port view, a lack of access port tracking, problems associated with glare, the presence of excess fluid (e.g., blood or cerebrospinal fluid), and/or occlusion of the view by fluid.
Furthermore, attempts to use currently available imaging sensors for port-based imaging would result in inadequate image stabilization. For example, a camera manually aligned to image the access port would be susceptible to misalignment when regularly bumped, shaken, or otherwise inadvertently moved by personnel, as well as having an inherent settling time associated with vibrations. Port-based optical imaging is further complicated by the need to switch to different fields of view for different stages of the procedure. Additional complexities associated with port-based optical imaging include the inability to infer dimensions and orientations directly from the video feed. [0004] In the case of port-based procedures, several issues generally preclude or impair the ability to perform port-based navigation in an intraoperative setting. For example, the position of the access port geometry relative to a typical tracking device employed by a typical navigation system is a free and uncontrolled parameter that prohibits the determination of the access port orientation. Furthermore, the limited access available due to the equipment required for the procedure makes indirect access port tracking methods impractical. Furthermore, the requirement to manipulate the port intraoperatively to access many areas within the brain during a procedure makes tracking the spatial position and pose of the port a difficult and challenging problem that had not been solved prior to the present disclosure. Thus, there is a need for an intelligent positioning system to assist in port-based intracranial medical procedures and surgical navigation. SUMMARY [0005] A computer-implemented method of adaptively and interoperably configuring an automated arm used during a medical procedure, wherein the method comprises: [0006] identifying a position and an orientation for a target in a predetermined coordinate frame; [0007] obtaining a position and orientation for an end effector on the automated arm, wherein the position and orientation are defined in the predetermined coordinate frame; obtaining a desired retracted distance and a desired orientation between the target and the end effector; [0008] determining a new desired position and new desired orientation for the end effector from the target position and orientation and the desired retracted distance and desired orientation; and [0009] moving the end effector to the new position and orientation. [0010] The end effector can be an imaging device that has a longitudinal axis. The target may be a surgical port that has a longitudinal axis. The desired orientation may be such that the longitudinal axis of the imaging device is collinear with the longitudinal axis of the surgical port. [0011] The imaging device may be an external video scope. [0012] The desired retracted distance can be between 10 cm and 80 cm. [0013] Alternatively, the desired retracted distance can be obtained from a predetermined list. The predetermined list can be related to specific users. The retracted distance can be increased or decreased in response to a user command. The user command can be received from one of a footswitch, a voice command and a gesture. [0014] The method can include a user moving the end effector to a position and setting the distance between the end effector and the target as the desired retracted distance.
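By way of illustration only, the pose computation recited in paragraphs [0006] to [0009] can be sketched in a few lines. This is a minimal sketch, assuming tracked values for the port tip and port axis; the function and variable names are hypothetical and not part of the disclosure:

```python
import numpy as np

def desired_end_effector_pose(port_tip, port_axis, retracted_distance):
    """Zero-position sketch: stand off from the port along its longitudinal
    axis at the desired retracted distance, with the imaging axis collinear
    with the port axis (paragraphs [0006] to [0010])."""
    axis = port_axis / np.linalg.norm(port_axis)       # normalize the tracked axis
    position = port_tip + retracted_distance * axis    # offset along the port axis
    view_direction = -axis                             # camera looks back down the axis
    return position, view_direction

# Illustrative values: port at the origin, axis pointing straight up,
# 30 cm standoff (within the 10 cm to 80 cm range recited above).
pos, view = desired_end_effector_pose(
    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 0.30)
```

A full six-degree-of-freedom pose would additionally fix the roll of the imaging device about the shared axis, which this sketch leaves unconstrained.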
[0015] The target can be moved during the medical procedure and the method may include identifying an updated position and orientation of the target, determining an updated new position and orientation for the end effector, and moving the end effector to the new updated position and orientation. [0016] The updated position and orientation of the target can be obtained continuously and the new updated position and orientation can be determined continuously. [0017] The end effector can be moved to the new updated position and orientation in response to a signal from the user. The user signal can be received from a footswitch. The user signal can be one of a voice command and a gesture. The end effector can be moved to the new desired position and orientation in response to predetermined parameters. The predetermined parameters can be that the target has not moved for more than a particular period of time. The particular time period can be from 15 to 25 seconds. The particular time period can be defined by a user. The predetermined parameters can be that the orientation is out of coaxial alignment by more than a predetermined number of degrees. The predetermined number of degrees can be defined by a user. The target can be a port and the predetermined parameters can be that less than a predetermined percentage of the port is within the total field of view. The predetermined percentage can be defined by a user. [0018] An intelligent positioning system for adaptively and interoperably positioning an end effector relative to a target during a medical procedure, which includes: an automated arm assembly that includes an articulated arm that has a distal end connectable to the end effector; a detection system for detecting a target position; and a control system and associated user interface operably connected to the automated arm assembly and operably connected to the detection system, wherein the control system is configured to: identify a position and orientation for a target in a predetermined coordinate frame; obtain a position and orientation for an end effector on the automated arm assembly, wherein the position and orientation are defined in the predetermined coordinate frame; obtain a desired retracted distance and a desired orientation between the target and the end effector; determine a new position and orientation for the end effector from the position and orientation of the target and the desired retracted distance and the desired orientation; and move the end effector to the new position and orientation. [0019] The system can include a visual display, and images from the imaging device can be displayed on the visual display. [0020] An automated arm assembly for use with an end effector, a target and a detection system may be for use during a medical procedure, wherein the automated arm assembly includes: a base frame; an articulated arm operably connected to the base frame and having a distal end detachably connectable to the end effector; a weight operably connected to the base frame that provides a counterweight to the articulated arm; and a control system operably connected to the articulated arm and to the detection system, which provides information related to a position of the target, wherein the control system determines a new position and orientation for the distal end of the articulated arm with respect to the position of the target, and whereby the distal end of the articulated arm can be moved in response to information from the control system.
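The trigger conditions recited in paragraph [0017] reduce to a small predicate over the tracked state. The following is a rough sketch under assumed parameter names; only the 15 to 25 second dwell range comes from the disclosure, and the other default thresholds are placeholders for the user-defined limits:

```python
def should_realign(dwell_time_s, off_axis_deg, port_visible_fraction,
                   dwell_threshold_s=20.0,      # within the 15-25 s range of [0017]
                   max_off_axis_deg=10.0,       # placeholder user-defined limit
                   min_visible_fraction=0.7):   # placeholder user-defined limit
    """Return True when any predetermined parameter of [0017] calls for moving
    the end effector to the new desired position and orientation."""
    settled = dwell_time_s >= dwell_threshold_s              # target has stopped moving
    misaligned = off_axis_deg > max_off_axis_deg             # out of coaxial alignment
    occluded = port_visible_fraction < min_visible_fraction  # port leaving the view
    return settled or misaligned or occluded
```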
[0021] The automated arm assembly can include a tower connected to the base frame and extending above it, the articulated arm being connected to the tower and extending out from it. The arm can be movable up and down the tower. The automated arm assembly may include a support beam with one end movably connected to the tower and the other end connected to the automated arm. The articulated arm can have at least six degrees of freedom. The automated arm assembly can be moved manually. The base frame may include wheels. [0022] The end effector can be tracked using the detection system. The articulated arm can include tracking markers that are tracked using the detection system. The automated arm assembly may include a radial arrangement connected to the distal end of the articulated arm, and the end effector may be movably connected to the radial arrangement, whereby the end effector moves along the radial arrangement in response to input from the control system. [0023] The automated arm assembly may include a joystick operably connected to the control system, and the movement of the articulated arm may be controllable by the joystick. [0024] The end effector can be one of an external video scope, an ablation laser, a gripper, an insertable probe, or a micromanipulator. The end effector may be a first end effector, and the assembly may additionally include a second end effector connectable near the distal end of the articulated arm. The second end effector can be a wide-angle camera. [0025] The control system can restrict the movement of the articulated arm based on preset parameters. Preset parameters can include space above the patient, space from the floor, maintaining the surgeon's line of sight, maintaining the tracking camera's line of sight, mechanical arm singularity, self-collision avoidance, patient collision avoidance, base orientation, and a combination thereof. [0026] The automated arm assembly can include a protective dome connected to the articulated arm, and the distal end of the articulated arm can be restricted to move only within the protective dome. A virtual safety zone can be defined by the control system and the distal end of the articulated arm can be restricted to move only within the safety zone. [0027] An alignment tool for use with a surgical port includes: a tip for insertion into the surgical port; and a generally conical portion connected at the distal end of the tip so that the conical portion extends away from the port end when the tip is fully inserted into the port. The conical portion can be produced from a plurality of circular rings. [0028] In some embodiments, intelligent positioning systems (and associated methods) to support port-based procedures are disclosed, which include the following components: one or more imaging devices; an externally guided and tracked automated arm configured to support one or more of the imaging devices; one or more tracking mechanisms or devices; one or more tracked marker assemblies or tracked markers; a navigation system configured to accept preoperative and/or intraoperative data; and an intelligent positioning system to control the pose and position of the automated arm. [0029] In some embodiments, a software system is provided which includes a user interface for performing surgical procedures, wherein the user interface includes viewing and image processing based on tracked devices and intracranial images (optionally acquired preoperatively and intraoperatively).
The combined result is an efficient imaging and surgical intervention system that maintains the surgeon in a preferred state (e.g., line of sight, bimanual manipulation) that is suitable or customized to perform surgery most effectively. [0030] In some embodiments, as described below, the access port may be employed to provide an optical viewing path to an imaging device. The imaging device acquires a high resolution image of the surgical area of interest and provides a means for the surgeon to visualize that surgical area of interest using a monitor that displays said image. The image can also be a video or image stream. [0031] In some embodiments, a system is provided which includes an intelligent positioning system, which is interfaced with the navigation system to position and align one or more imaging devices relative to (and/or within) an access port. In order to achieve automated alignment, tracking devices can be employed to provide spatial position and pose information, in a common coordinate frame, for the access port, the imaging device, the automated arm and optionally other surgically relevant elements such as surgical instruments within the operating room. The intelligent positioning system can provide a mechanically robust mounting position configuration for a port-based imaging sensor, and can allow the integration of preoperative images in a way that is useful to the surgeon. A further understanding of the advantageous and functional aspects of the disclosure can be obtained by referring to the drawings and the detailed description below. BRIEF DESCRIPTION OF THE DRAWINGS [0032] Embodiments will now be described, by way of example only, with reference to the drawings, in which: [0033] Figure 1 is an exemplary embodiment illustrating system components of an exemplary surgical system used in port-based surgery. [0034] Figure 2 is an exemplary embodiment that illustrates various detailed aspects of a port-based surgery, as seen in Figure 1. [0035] Figure 3 is an exemplary embodiment illustrating system components of an exemplary navigation system. [0036] Figures 4A to 4E are exemplary embodiments of various components of an intelligent positioning system. [0037] Figures 5A to 5B are exemplary embodiments of an intelligent positioning system that includes a lifting column. [0038] Figures 6A to 6C are exemplary embodiments that illustrate the alignment of an imaging sensor with a target (port). [0039] Figure 7 is an exemplary embodiment of an alignment sequence implemented by the intelligent positioning system. [0040] Figure 8A is a flowchart depicting the sequence involved in aligning an automated arm with a target. [0041] Figure 8B is a flowchart depicting the sequence involved in aligning an automated arm with a target. [0042] Figure 9A is a flowchart depicting the sequence involved in aligning an automated arm with a target. [0043] Figure 9B is an illustration depicting a visual indicator system to assist a user in manual alignment of an automated arm. [0044] Figures 10A to 10B are illustrations depicting tool features that can be used in optical detection methods. [0045] Figure 11 is a flowchart describing the sequence involved in an embodiment to determine the zero position and the desired position of the end effector. [0046] Figures 12A to 12B illustrate alignment of exemplary embodiments of an access port, in multiple views. [0047] Figure 13 is an illustration depicting port characteristics that can be used in optical detection methods.
[0048] Figures 14A to 14B are block diagrams showing an exemplary navigation system that includes an intelligent positioning system. [0049] Figure 15 is a flowchart depicting the steps of a port-based surgical procedure. [0050] Figures 16A to 16D are exemplary embodiments illustrating an introducer and port during cannulation into the brain. DETAILED DESCRIPTION [0051] Various embodiments and aspects of the disclosure will be described with reference to the details discussed below. The following description and drawings are illustrative of the disclosure and should not be construed as limiting the disclosure. Numerous specific details are described to provide a full understanding of the various embodiments of the present disclosure. However, in certain instances, conventional or well-known details are not described, in order to provide a concise discussion of embodiments of the present disclosure. [0052] As used herein, the terms “comprises” and “comprising” shall be construed as being inclusive and open-ended, and not exclusive. Specifically, when used in the specification and claims, the terms “comprises” and “comprising” and variations thereof mean that specified features, steps or components are included. These terms should not be interpreted as excluding the presence of other features, steps or components. [0053] As used herein, the term “exemplary” means “serving as an example, instance, or illustration,” and should not be interpreted as preferable or advantageous over the other embodiments disclosed herein. [0054] As used herein, the terms “about” and “approximately” are intended to cover variations that may exist at the upper and lower limits of the ranges, such as variations in properties, parameters, and dimensions. In a non-limiting example, the terms “about” and “approximately” mean plus or minus 10 percent or less. [0055] As used herein, the term “navigation system” refers to a surgical operating platform that includes within it an intelligent positioning system, as described within this document. [0056] As used herein, the term “imaging sensor” refers to an imaging system which may or may not include within it an illumination source to acquire the images. [0057] As used herein, the term “tracking system” refers to a recording apparatus that includes an operating platform that may be included as part of, or be independent of, the intelligent positioning system. [0058] Various embodiments of the present disclosure seek to address the aforementioned inadequacies of existing devices and methods to support port-based surgical procedures. [0059] Minimally invasive brain surgery using access ports is a newly conceived method of performing surgery on brain tumors previously considered inoperable. An object of the present invention is to provide a system and method to aid in minimally invasive port-based brain surgery. To address intracranial surgical interests, specific products, such as the NICO BrainPath™ port, have been developed for port-based surgery. As seen in Figure 16A, port 100 comprises a cylindrical assembly formed from an outer sheath. The port 100 may receive the introducer 1600, which is an internal cylinder that slides into engagement with the inner surface of the port 100. The introducer 1600 may have a distal end in the form of a conical atraumatic tip to allow insertion along the sulcal folds 1630 of the brain.
Port 100 is of sufficient diameter to allow manual manipulation of traditional surgical instruments, such as suction devices, scissors, scalpels, and cutting devices, as examples. Figure 16B shows an exemplary embodiment in which surgical instrument 1612 is inserted down port 100. [0060] Figure 1 is a diagram illustrating components of an exemplary surgical system used in port-based surgery. Figure 1 illustrates a navigation system 200 having an equipment tower 101, a tracking system 113, a display 111, an intelligent positioning system 250 and tracking markers 206 used to track instruments or an access port 100. The tracking system 113 may also be considered an optical tracking device or a tracking camera. [0061] In Figure 1, a surgeon 201 performs a tumor resection through a port 100, using an imaging device 104 to view the port at sufficient magnification to allow sharp visibility of instruments and tissue. The imaging device 104 may be an external scope, video scope, wide field camera, or an alternate image capture device. The imaging sensor view is shown on the visual display 111, which the surgeon 201 uses to navigate the distal end of the port through the anatomical region of interest. [0062] An intelligent positioning system 250, comprising an automated arm 102, a lifting column 115 and an end effector 104, is disposed close to the patient 202. The lifting column 115 is connected to a frame of the intelligent positioning system 250. As seen in Figure 1, the proximal end of an automated mechanical arm 102 (also referred to herein as an automated arm) is connected to the lifting column 115. In other embodiments, the automated arm 102 may be connected to a horizontal beam 511 as seen in Figure 5A, which is then connected either to the lifting column 115 or to the frame of the intelligent positioning system 250 directly. The automated arm 102 may have multiple joints to allow for 5, 6, or 7 degrees of freedom. [0063] The end effector 104 is connected to the distal end of the automated arm 102. The end effector 104 can accommodate a plurality of instruments or tools that can assist the surgeon 201 in the procedure. The end effector 104 is shown as an external scope; however, it should be noted that this is merely an exemplary embodiment, and alternative devices may be used as the end effector 104, such as a wide field camera 256 (shown in Figure 2), a microscope, OCT (Optical Coherence Tomography), or other imaging instruments. In an alternative embodiment, multiple end effectors may be connected to the distal end of the automated arm 102 to assist the surgeon in switching between multiple modalities. For example, the surgeon may want the ability to move between a microscope and OCT with offset optics. In an additional example, the ability to connect a second, more accurate but shorter range end effector, such as a micromanipulator-controlled laser-based ablation system, can be contemplated.
[0064] The intelligent positioning system 250 receives as input the spatial position and pose data of the automated arm 102 and the target (e.g., port 100) as determined by the tracking system 113 by detecting the tracking markers 246 on the wide field camera 256 and the tracking markers 206 on port 100, as shown in Figure 2. Additionally, it should be noted that tracking markers 246 can be used to track both the automated arm 102 and the end effector 104, either collectively (together) or independently. It should be noted that the wide field camera 256 is shown in this figure connected to the external scope 266, and that the two imaging devices together form the end effector 104. It should further be noted that although these are depicted together for illustration of the diagram, both can be used independently of each other; for example, as shown in Figure 5A, an external video scope 521 is depicted independently of the wide field camera. [0065] The intelligent positioning system 250 computes the desired joint positions for the automated arm 102 in order to maneuver the end effector 104 mounted on the distal end of the automated arm into a predetermined spatial position and pose with respect to the port 100. This predetermined relative spatial position and pose is termed the “zero position” and is described in more detail below and shown in Figures 6A to 6B, where the imaging sensor and port are axially aligned 675 with a linear line of sight. [0066] Additionally, the intelligent positioning system 250, the optical tracking device 113, the automated arm 102 and the tracking markers 246 and 206 form a feedback loop. This feedback loop works to keep the distal end of the port (located within the brain) in constant view and focus of the end effector 104, where it is an imaging device, as the port position may be dynamically manipulated by the surgeon during the procedure. The intelligent positioning system 250 may also include a foot pedal 155 for use by the surgeon 201 to align the end effector 104 (i.e., a video scope) of the automated arm 102 with the port 100. The foot pedal 155 is also shown in Figures 5A, 5C and 7. [0067] Figure 3 is a diagram illustrating system components of an exemplary port-based surgery navigation system. In Figure 3, the main components to support minimally invasive port-based surgery are shown as separate units. Figure 3 shows an exemplary system that includes a monitor 111 for displaying a video image, an optical equipment tower 101, which provides a light source, camera electronics and video storage equipment, and an automated arm 102, which holds an imaging sensor 104. A patient's brain is held in place by a head holder 117, and inserted into the head is an access port 100 and an introducer 1600, as shown in Figure 16A. The introducer 1600 may be replaced with a tracking probe (with a tracking marker 116 attached) or a medically relevant instrument such as instrument 1612 used for port-based surgery. The introducer 1600 is tracked using a tracking system 113, which provides position and orientation information for tracked devices to the intelligent positioning system 250. [0068] An example of the surgeon dynamically manipulating port 100 is shown in Figure 16D. In Figures 16C to 16D, a port-based tumor resection is performed on brain 1640. The surgeon 201 will typically maneuver port 100 to actively seek and provide access to as much of the tumor 120, or equivalently unhealthy tissue, as possible in order to extract it using a medical instrument 1612. In Figure 16C there is a section of tumor 1680 that is not accessible given the positioning of port 100. In order to extract that section of tumor 1680, surgeon 201 maneuvers port 100 through a turn as shown by the arrows marked 1665. Now referring to Figure 16D, this maneuver of the port 100 allows the surgeon 201 to access the previously inaccessible section 1680 of the tumor 120 in order to extract it using the medical instrument 1612.
ARM DESCRIPTION [0069] The method according to the invention described in the present document is suitable both for a single automated arm of a multi-arm automated system and for the aforementioned single-arm automated system. The gain in valuable operating time, shorter anesthesia time and simpler device operation are direct consequences of the system according to an exemplary version of the invention as shown in Figure 1. [0070] Figures 4B and 4C illustrate alternative exemplary embodiments of automated arms. In Figure 4B the distal end 408 is positioned using an extended automated arm 102 that extends over the surgeon 201. The base 428 of that arm 102 can be positioned away from the patient 202 to provide clear access to the patient 202 resting on the surgical stretcher. Base 428 can be equipped with swivel wheels 458 to facilitate mobility in the operating room. A counterweight 438 can be provided to mechanically balance the system and minimize the load on the actuators (this weight serves the same function as the weight 532 in Figure 5B). The distal end 408 can be arbitrarily positioned due to the presence of a redundant number of degrees of freedom. Joints such as the pivot base 418 in Figure 4B and the joints 448 provide these degrees of freedom. The imaging device 104 can be connected to the end joint or, equivalently, to the distal end 408. [0071] Figure 4C illustrates another embodiment in which a commercially available arm 102 can be used. Again, the joints 448 provide a redundant number of degrees of freedom to aid in easy movement of the distal end 408. In another embodiment, the distal end may have connectors that can rigidly hold an imaging device while facilitating easy removal of the device for interchangeability with other imaging devices. [0072] Figure 4D illustrates an alternative embodiment in which a radial arrangement 499 is employed for the distal end. This arrangement allows the end effector to slide along the curved segment 499 to provide an additional degree of freedom. [0073] It should be noted that while Figures 4B and 4C illustrate a floor-standing design, this embodiment is not intended to limit the scope of the disclosure, and other configurations may be employed. For example, alternative exemplary configurations include a structure that is supported by the operating room ceiling; a structure extending from a tower intended to house imaging instrumentation; and a structure rigidly connecting the base of the automated arm to the operating table. [0074] In some embodiments, multiple arms can be used simultaneously for a procedure and navigated from a single system. In such an embodiment, each distal end can be tracked separately so that the orientation and location of the devices are known to the intelligent positioning system, and the position and/or orientation of the mounted distal end devices can be controlled by the actuation of the individual automated arms based on feedback from the tracking system. This tracking can be accomplished using any of the previously disclosed methods and devices. [0075] In an alternative embodiment, the patient's head may be compliantly supported by a second automated arm instead of the rigid frame 117 illustrated in Figure 1. The automated head support arm may be equipped with force-sensing actuators that provide signals that allow tracking of small head movements.
This sensed head position can be provided as feedback to control the relative position of the first automated arm and the corresponding position of the distal end used to mount the device (such as an imaging sensor). This coupling of the mount that holds the head with the imaging system can help reduce motion artifacts while providing patient comfort. Patient comfort will be enhanced due to the elimination of the sharp points used in traditional head immobilization systems. [0076] In current surgical procedures, operating room space available around the patient being operated on is scarce due to the many personnel and devices required to perform the surgery. Therefore, minimizing the space required by the device around the surgical bed is ideal. [0077] In one embodiment, the space required by the automated arm can be minimized, compared to currently used surgical arms, through the use of a cantilever design. This design element allows the arm to be suspended over the patient, freeing up space around the patient that most automated arms currently occupy during surgical procedures. Figure 5A shows such a cantilevered arm 511, where the arm anchor is a heavy base 512. This allows the arm to be suspended with minimized risk of tipping over, as the heavy base counterbalances the arm. [0078] In another embodiment, the space required by the automated arm can be minimized, compared to currently used surgical arms, through the use of a concentrated counterweight 532 connected to the base of the automated arm 512, which yields a small footprint, not only in its height dimension but also in the floor area it occupies. It should be noted that the space freed in the height direction can be occupied by other devices or instruments in the OR, such as a surgical tool table. In addition, the smaller area required by the base of this automated arm can allow for less restricted movement of staff around the patient, as well as more supplemental devices and instruments to be used. Figure 5B shows such a base that uses minimal space and has a concentrated weight 532. The automated arm in this example is held at a particular height by a lifting column 115, as the design requires minimal space. In addition, some alternative embodiments that can be used for the lifting column 115 include a 4-bar arm, a scissor lift, and pneumatic pistons. TRACKING [0079] In one embodiment, as illustrated in Figure 2 and Figure 4E, tracking markers 206 may be fitted to port 100. The spatial position and pose of the port (target) are determined using the tracking markers 206, which are detected by the tracking device 113 shown in Figure 1 and registered within a common coordinate frame. From the spatial position and pose of port 100 (target), the desired position of end effector 104 and automated arm 102 can be determined. As shown in Figure 7, the lifting column 115 can raise or lower the automated arm 102 from an actual position 700 to a desired position 710. For this purpose it is possible, for example, for tracking markers 246 located in a marker assembly, as shown in Figure 2, to be fitted to the automated arm 102, so that its spatial position and pose in the operating room (OR) can thereby be determined by the tracking device 113 and the intelligent positioning system 250. Additionally, the spatial position and pose of the automated arm can also be determined using position encoders located on the arm that allow encoding of joint angles.
These angles, combined with the lengths of the respective arm segments, can be used to infer the spatial position and pose of the end effector 104 or, equivalently, the imaging sensor (e.g., the exoscope 521 shown in Figure 5A) relative to the base 512 of the intelligent positioning system 250, given that the spatial position and pose of the automated arm base 512 is registered in the common coordinate frame. [0080] In one embodiment, passive tracking markers such as the reflective spherical markers 206 shown in Figure 2 are viewed by the tracking device 113 to generate identifiable points for spatially locating and determining the pose of a tracked object (e.g., a port 100 or external scope 521) to which the tracking markers are connected. [0081] As seen in Figure 4E, a medical instrument (target) such as port 100 can be tracked by a unique, connected marker assembly 465 which is used to identify the corresponding medical instrument, including its spatial position and pose as well as its 3D volume representation, in a navigation system 200, within the common coordinate frame. In Figure 4E the port 100 is rigidly connected to the tracking marker assembly 465, which is used to determine its spatial position and pose in 3D. [0082] Typically, a minimum of 3 spheres are placed on a tracked medical instrument or object to define it. In the exemplary embodiment of Figure 4E, 4 spheres are used to track the target object (port). [0083] The navigation system typically uses a tracking system. The localization of tracking markers is based, for example, on at least three tracking markers 206 that are statically arranged on the target (e.g., port 100), as shown in Figure 2, outside the patient's body 202 or connected to it. A tracking device 113 as shown in Figure 1 detects the tracking markers 206 and determines their spatial position and pose in the operating room, which is then registered in the common coordinate frame and subsequently stored by the navigation system. [0084] An advantageous feature of an optical tracking device is the selection of markers that can be targeted very easily and therefore detected by the tracking device. For example, infrared (IR) reflective markers and an IR light source can be used. Such an apparatus is known, for example, from tracking devices such as the “Polaris” system available from Northern Digital Inc. In a further embodiment, the spatial position of the port (target) 100 and the position of the automated arm 102 are determined by optical detection using the tracking device. Once optical detection takes place, the spatial markers are made optically visible by the device and their spatial position and pose are transmitted to the intelligent positioning system and other components of the navigation system. [0085] In a preferred embodiment, the navigation system, or equivalently the intelligent positioning system, may utilize reflective sphere markers 206, as shown in Figure 4E, in combination with a tracking device, to determine the spatial positioning of medical instruments within the operating room. The differentiation of tool and target types, and their corresponding geometrically accurate virtual volumes, can be determined by the unique individual specific orientation of the reflective spheres relative to each other in a marker assembly 445. This provides each virtual object with an individual identity within the navigation system.
These individual identifiers provide information to the navigation system, such as the size and virtual shape of instruments within the system in relation to the location of their respective marker assemblies. The identifier can also provide information such as the tool center point, the tool center geometry axis, etc. The virtual medical instrument may also be determinable from a database of medical instruments provided to the navigation system. [0086] Other types of tracking markers that can be used include RF, EM, LED (pulsed and non-pulsed), glass beads, reflective stickers, and unique patterns and structures, where RF and EM markers have specific signatures for the specific tools to which they are connected. Reflective stickers, structures and patterns, glass beads, and LEDs can all be detected using optical detectors, while RF and EM can be acquired using antennas. The advantages of using EM and RF tags include removal of the line-of-sight requirement during operation, whereas using an optical system removes the additional noise from electrical detection and emission systems. [0087] In an additional embodiment, 3D or printed design markers can be used for detection by the imaging sensor, as long as it has a field of view inclusive of the tracked medical instruments. Printed markers can also be used as a calibration pattern to provide (3D) distance information to the imaging sensor. These identification markers can include designs such as concentric circles with different ring spacings, and/or different types of bar codes. Additionally, beyond the use of markers, the contours of known objects (e.g., the side of the port) can be made recognizable by optical imaging devices through the tracking system, as described in [Monocular Model-Based 3D Tracking of Rigid Objects: A Survey]. In an additional embodiment, reflective spheres, or other suitable active or passive tracking markers, can be oriented in multiple planes to expand the range of orientations that would be visible to the camera. [0088] An embodiment illustrating a port used in neurosurgery, as described above, is shown by way of example in Figure 16B, which shows an access port 100 that has been inserted into the brain using an introducer 1600, as previously described. In the illustration shown in Figure 16B, the introducer has been removed. The same access port 100 shown in Figure 4E includes a plurality of tracking elements 206 as part of a tracking marker assembly 465. The tracking marker assembly comprises a rigid structure 445 for supporting the attachment of a plurality of tracking elements 206. The tracking markers 206 may be of any suitable shape to enable tracking, as listed above. In some embodiments, assembly 465 may be connected to port 100, or integrated as part of port 100. It should be understood that the orientation of the tracking markers may be selected to provide adequate tracking over a wide range of relative medical instrument positions and poses, and relative imaging sensor positions and poses. SAFETY SYSTEM [0089] A challenge with automated movement in a potentially crowded space, such as the operating room, is the accidental collision of any part of the automated arm with members of the surgical team or the patient.
In some embodiments, this may be avoided by partially enclosing the distal end 408 within a transparent or translucent dome shield 645, as shown in Figure 6A, which is intended to prevent accidental contact of the end effector 104 or, equivalently, the imaging device 521 with surgical team members or the patient. [0090] In an alternative embodiment the protective dome can be realized virtually through the use of proximity sensors. Consequently, a physical dome may be absent, but a safety zone 655 around the distal end 408, as shown in Figures 6B and 6C, may be established. In one embodiment, this can be achieved through the use of proximity sensor technologies to prevent accidental contact between surgical team members and any moving part of the imaging-sensor-mounted automated arm. An additional embodiment may include a collision sensor to ensure that the moving automated arm does not collide with any objects in the environment. This can be implemented using electrical current sensors, force or velocity sensors, and/or defined spatial boundaries of the automated arm. [0091] It should be noted that the safety systems described above are exemplary embodiments of various safety systems that can be used in accordance with the intelligent positioning system and should not be interpreted as limiting the scope of this disclosure. In one embodiment, the intelligent positioning system has the ability to acquire the spatial position and pose of the target as well as of the automated arm, as described above. In possession of this information, a constraint can be imposed on the intelligent positioning system not to position the automated arm within a safety semicircle around the target. In an additional embodiment shown in Figure 6C, a reference marker 611 may be connected to the patient immobilization frame (117) to provide a reference of the patient's spatial head position and pose, in the common coordinate frame, to the intelligent positioning system through the tracking mechanisms described above. Once the position of this reference marker is determined, a positional constraint can be applied to the automated arm, effectively defining a “no-fly zone”. Bearing in mind that the reference marker 611 has the coordinates (x_r, y_r, z_r, α_r, β_r, γ_r), [0092] where the subscript “r” designates a coordinate of the reference marker and α, β, γ are the roll, pitch, and yaw of the marker, a new reference origin within the common coordinate frame can be defined by assigning the marker's spatial position to be the origin, and the top, left and right sides of the marker (as determined with respect to the common coordinate frame by inference from the acquired roll, pitch, and yaw) to be the z-direction, x-direction, and y-direction with respect to the new reference origin within the common coordinate frame. Bearing in mind that the position of the end effector on the automated arm is defined in spherical coordinates, for example (r_E, θ_E, φ_E), [0093] where the subscript “E” designates an end effector coordinate, a region can be defined in spherical coordinates that constrains the end effector to an area 655 outside of which a “no-fly zone” is defined. This can be achieved by defining an angular and radial range with respect to the reference origin that the end effector cannot cross.
An example of such a range is as follows: r_min ≤ r_E ≤ r_max and φ_min ≤ φ_E ≤ φ_max, [0094] where the “min” subscripts designate the minimum coordinate in the particular spherical direction that the end effector can occupy and the “max” subscripts designate the maximum coordinate in the particular spherical direction that the end effector can occupy. The exemplary radial and angular limit ranges are indicated in two dimensions in Figure 6C as 651 (r_min) to 621 (r_max) and 631 (φ_min) to 641 (φ_max), respectively. Analogous constraints can be used to define additional restricted regions, for example, those related to the surgeon's line of sight, the tracking device's line of sight to the tracking markers on the end effector, and conservation of the area where the surgeon's hands will be using the tools. With reference to the port-based surgery described above, a common acceptable range (e.g., dotted line 661 defines the length from the reference marker to the beginning of the “no-fly zone” shown in Figure 6C) from the end effector to the target, to allow the surgeon to work comfortably, is 20 cm to 40 cm (i.e., r_min = 20 cm and r_max = 40 cm). [0095] In one embodiment, a safety zone can be established around the surgical team and patient using uniquely identifiable tracking markers that are applied to the surgical team and patient. Tracking markers may be limited to the torso or dispersed over the bodies of the surgical team, but in sufficient numbers so that a full-body estimate of each individual can be reconstructed using these tracking markers. The body modeling accuracy for surgical team members and the patient can be further enhanced through the use of tracking markers that are uniquely coded for each individual, and through the use of profile information that is known for each individual, similar to the way the tracking marker assemblies identify their corresponding medical instruments to the intelligent positioning system as described above. Such markers indicate a “no-fly zone” that should not be invaded when the end effector 104 is being aligned to the access port by the intelligent positioning system. The safety zone can also be realized by defining such zones before starting the surgical procedure using a pointing device and capturing its position using the navigation system. [0096] In another embodiment, multiple cameras can be used to view the OR in 3D and track all automated arms to optimize their movement and prevent them from colliding with objects in the OR. A system that has this capability is described in [“System Concept for Collision-Free Robot Assisted Surgery Using Real-Time Sensing”. Jorg Raczkowsky, Philip Nicolai, Bjorn Hein, and Heinz Worn. IAS 2, volume 194 of Advances in Intelligent Systems and Computing, pages 165 to 173. Springer, (2012)]. [0097] Additional constraints on the intelligent positioning system used in the surgical procedure include self-collision avoidance and automated arm singularity prevention, which are explained as follows. Self-collision avoidance can be implemented given that the kinematics, arm size and payload are known to the intelligent positioning system. The system can therefore monitor the joint encoders to determine if the arm is about to collide with itself. If a collision is imminent, the intelligent positioning system imposes a movement restriction on the automated arm and all non-inertial movement is stopped. [0098] In an exemplary embodiment, given an automated arm with 6 degrees of freedom, the arm does not have the ability to overcome a singularity.
As such, when a singularity condition is approached, the intelligent positioning system imposes a motion restriction on the automated arm and all non-inertial motion is stopped. In another exemplary embodiment, as shown in Figure 5A, an automated arm with a seventh degree of freedom is provided by adding a lifting column 115. In this case, singularities can be overcome, as restricted motions at one joint can be compensated by the movement of another joint. While this allows the intelligent positioning system to overcome singularities, it presents a more difficult kinematics problem. An end effector pose is defined by 3 translational and 3 rotational degrees of freedom; performing the inverse kinematics of a 7-DOF manipulator requires inverting a 6x7 matrix, whose solution is not unique. Therefore, while a 7-degree-of-freedom manipulator allows singularities to be rotated through due to its redundancy, this comes at additional computational cost. By adding an extra constraint, such as constraining the elbow to be at a particular height, the system can find a single solution, which again eases its computational requirements. [0099] Bearing in mind that the automated arm is mobile, for medical flexibility and cost-effectiveness, another constraint is instilled in the intelligent positioning system: to ensure that either the mobile base 512 is in motion or the automated arm is in motion at any given time, but not both. This is achieved by the system having a self-locking mechanism that applies brakes to the base when arm movement is required. The rationale for this restriction is that arm movement without a static base will result in corresponding motion of the base (basic physics). If the arm is mounted on a vertical lift column, the lift column adds to this constraint set: the lift column cannot be activated if the mobile base wheels are not braked or if the arm is in motion. Similarly, the arm cannot be moved if the lift column is active. If the movable base wheel brakes are released, the arm and column are de-energized and placed in a braked state. ADVANTAGES OF THE ARM [0100] In an advantageous embodiment of the system, the automated arm with an external scope mounted will automatically move to the zero position (i.e., the predetermined spatial position and pose) relative to the port (target) through the process shown in Figure 8A. When this is done during the surgical procedure, it is possible to immediately resume the patient's treatment, allowing the surgeon to skip the periodic manual step of realigning the external scope with the port. [0101] In a preferred embodiment, the chosen automated alignment position will align the distal end, with the mounted external scope, to provide a view of the bottom (distal end) of the port (for port-based surgery as described above). The distal end of the port is where the surgical instruments will be operating and thus where the surgical region is located. In another embodiment, this alignment (to provide a view of the bottom of the port) can be set manually by the surgeon or set automatically by the system, depending on the surgeon's preference, and is called the “zero position”. To automatically define the view, the intelligent positioning system will have a pre-defined alignment for the end effector relative to the port that it will use to align the automated arm. [0102] Reference is now made to Figure 6A, which depicts the preferred zero position of end effector 104 with respect to port 100.
The relative pose of the imaging device (the external scope 521 or wide field camera 256) is selected so as to ensure both a coaxial alignment and an offset 675 from the proximal end of the port, as shown in Figures 6A to 6B. More specifically, this ensures a coaxial alignment of the imaging device axis, which forms, for example, a central longitudinal axis of the imaging device, with the longitudinal axis of the port (target) (such as 675 shown in Figures 6A to 6B), as predefined by the zero position. This is particularly suitable for cases such as the aforementioned port-based surgery method for tumor resection, as well as decompression and microscopic lumbar discectomy, as it allows the port to be viewed from the optimal angle that results in the largest field of view for the surgeon, in which the surgeon will be manipulating the surgical instruments to perform the surgery, for example, as described above and illustrated in Figures 16A, 16B and 16C. A collinear alignment provides the ideal view, as the line of sight of the imaging device is normal to the plane of the region of interest, preventing occlusion by the port walls that would occur with alternate lines of sight. SEMI-MANUAL/MANUAL AUTOMATED ARMS [0103] The exemplary embodiments of the automated arms shown in Figures 6A and 6B and described in the previous paragraph are shown supporting an external imaging device that has tracking markers 246 attached thereto. In these Figures, a floor-mounted arm is shown with a large-range manipulator component 685 that positions the end effector of the automated arm (e.g., with 6 degrees of freedom) and a smaller-range-of-motion positioning system (e.g., with 6 degrees of freedom) mounted on the distal end 408. As shown in Figure 6A, the distal end of the automated arm 408 refers to the mechanism provided at the distal portion of the automated arm, which can support one or more end effectors 104 (e.g., an imaging sensor). The choice of end effector would depend on the surgery being performed. [0104] The alignment of the end effector of the automated arm is shown in Figures 6A to 6B. When the access port is moved, the system detects the motion and responsively resets the precise position of the automated arm to be coaxial 675 with the access port 100, as shown in Figure 6B. In an additional embodiment, the automated arm can maneuver through an arc to define a view enabling 3D imaging. There are two ways to accomplish this: 1) use two 2D detectors at known positions on the arm, or 2) use a single 2D detector that swings back and forth in the view (or moves in and out). ALIGNMENT [0105] Figure 7 is a representation of an alignment sequence implemented by the intelligent positioning system. In Figure 7, the automated arm 102 can be moved from its current position 700 to its desired position 710 with the aid of a cost minimization algorithm or, equivalently, an error minimization method via the intelligent positioning system 250. [0106] In Figure 7, the current position 700 of the automated arm 102 is acquired continuously. The automated arm achieves the desired alignment (zero position) with the target (port 100) through movement activated by the intelligent positioning system. The intelligent positioning system 250 drives the current position 700 of the arm 102 to approach the desired position of the arm 710, as depicted by the arrow 720 in Figure 7.
This approximation continues until the current arm alignment position approaches the desired alignment (zero position) within a certain tolerance. At the desired alignment 710, the automated arm 102 mounted with the imaging device 104 is then at the zero position with respect to the target (port 100). Subsequent alignment of automated arm 102 to the desired position 710 with respect to port 100 can be activated continuously or on demand by surgeon 201 through use of the footswitch 155. [0107] The cost minimization method applied by the intelligent positioning system is described as follows and depicted in Figure 8A. In one embodiment, visual servoing is performed in a manner in which the tracking device(s) 113 are used to provide an external control loop for the pose orientation and precise spatial positioning of the distal end of the automated arm 102, to which the imaging device 104 can be connected. The intelligent positioning system also utilizes this outer control loop to compensate for deficiencies and unknowns in the underlying automated control systems, such as encoder inaccuracy. [0108] Figure 8A is an exemplary flowchart depicting the sequence involved in aligning an automated arm with a target using a cost minimization method. In the first step (810), the pose and spatial position of the end effector are determined, typically in the common coordinate frame, through the use of the tracking device or another method such as the model matching or SIFT techniques described in more detail below. In the next step (820), the desired end effector spatial pose and position are determined by the process 1150 shown in Figure 11 and described further below. [0109] The end effector pose error, as used in step (830), is calculated as the difference between the present end effector spatial position and pose and the desired end effector spatial position and pose, and is shown as the arrow 720 in Figure 7. An error threshold, as used in step (840), is determined by both end effector pose error requirements and automated arm limitations, which can include joint resolution, power minimization, or motor life expectancy maximization. If the end effector pose error is below the threshold, no automated arm movement is commanded and the intelligent positioning system waits for the next pose estimation cycle. If the pose error is greater than the threshold, the flowchart continues to step (850), where the end effector error 720 is converted by the intelligent positioning system into a desired move. The final step (860) requires the intelligent positioning system to calculate the required movement of each joint of the automated arm 102 and command those movements. The system then repeats the cycle and continuously takes new pose estimates from the intelligent positioning system 250 to update the end effector spatial pose and position error estimate. ALIGNMENT FLOWCHART [0110] In one embodiment, the intelligent positioning system can perform automated arm alignment with the port, optimized for port-based surgery, using the method described by the flowchart depicted in Figure 8B. Figure 8B depicts the method deployed in the flowchart of Figure 8A in a refined version as used in the port-based surgery described above. In Figure 8B, an exemplary system diagram is shown illustrating various component interactions for tracking of the access port (target) by the automated arm supporting an imaging device.
Tracking and alignment can be manually enabled by the surgeon, or set to continuous tracking, or to various other automated arm alignment modes described below in more detail. In either example mode, the operational flow can be performed as follows: [0111] 1. The tracking device(s) transmit the spatial positions and poses of the patient, the access port and the end effector (analogous to step 810 in Figure 8A) to the intelligent positioning system, after which they are registered to the common coordinate frame. The coordinates in this step are given for the port, the patient and the end effector as 815, 805 and 825, respectively, as shown in Figure 8B. [0112] 2. If, for example, the imaging sensor is to be continuously (i.e., in real time) aligned with respect to the access port at the zero position as described below (in the common coordinate frame), a new desired spatial position and pose for the end effector (mounted with the imaging sensor), including camera zoom and focus, is calculated; this is shown as step (845) in Figure 8B and is analogous to step 820 in Figure 8A as described above. In one embodiment, the zero position is one that orients the imaging device coaxially with the access port during port-based surgery, as described in more detail below in the description of Figure 15. If, alternatively, the end effector is continuously aligned relative to a medical instrument, e.g., the surgical pointer tools 1015 and 1005 shown in Figure 10B, the same calculations are computed to orient the imaging sensor so that the focal point is directed at the tip of the medical instrument, or aligned relative to it at a predetermined zero position (by the process described in Figure 11). [0113] 3. In the next step (855), analogous to step 850 in Figure 8A, the intelligent positioning system (using an inverse kinematics engine) reads the current joint positions of the automated arm and computes offset joint positions for the automated arm that would achieve the desired end effector spatial position and pose as defined by the zero position. [0114] 4. The intelligent positioning system then drives the joints to the desired joint angles through a motor controller (865) contained in the intelligent positioning system; this step is analogous to step 860 in Figure 8A. Inputs to the motor controller include the joint encoders (885) located on the automated arm, as well as any torque/force sensors 875 connected (i.e., to the intelligent positioning system). It will be understood that several strategies can be used for determining the trajectory of the automated arm. Some examples are: a straight-line path of the distal end frame, equal joint velocity, and equal joint travel time. If the location and geometry of other equipment in the vicinity of the arm are known, the trajectory can also be planned to avoid that equipment. [0115] 5. During execution of the automated arm trajectory, one or more gauges, sensors or monitors (such as motor current, accelerometers and/or force gauges) can be monitored to stop the arm in the event of a collision; a minimal sketch of this monitoring appears below. Other inputs to prevent a collision include proximity sensors that can provide information (835) regarding obstacles in the vicinity of the automated arm, as well as the defined "no-fly zones" 655 depicted in Figures 6B to 6C and described in this document.
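The following sketch illustrates, under assumptions, how step [0115] might gate trajectory execution. The sensor thresholds and the arm interface (command_joints, read_motor_currents, read_proximity, in_no_fly_zone, stop) are hypothetical and not taken from the disclosure.

```python
import numpy as np

MAX_MOTOR_CURRENT_A = 2.5   # assumed per-joint collision threshold
MIN_PROXIMITY_M = 0.10      # assumed minimum clearance to obstacles

def execute_trajectory(arm, waypoints):
    """Step through joint-space waypoints, stopping on any collision indicator
    (motor current spike, proximity violation, or no-fly-zone entry)."""
    for joint_targets in waypoints:
        arm.command_joints(joint_targets)
        if np.any(arm.read_motor_currents() > MAX_MOTOR_CURRENT_A):
            arm.stop()          # current spike suggests contact
            return False
        if arm.read_proximity() < MIN_PROXIMITY_M or arm.in_no_fly_zone():
            arm.stop()          # obstacle or restricted zone 655
            return False
    return True
```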
[0116] Because the surgical field is crowded with equipment and personnel, it may be desirable for all gross alignment of the distal end to be performed manually, and only fine alignment to be performed automatically from the tracked data. [0117] Constant realignment of an end effector with a moving target during port-based surgery is difficult to achieve, as the target is moved frequently, and constant realignment can increase the danger to equipment and personnel in the operating room. [0118] Motion artifacts can also induce motion sickness in surgeons who constantly view the system output. There are multiple modalities that address this problem, two of which are described further; a sketch of the combined gating logic follows this passage. The first involves the intelligent positioning system restricting the arm's movement so that it realigns with the target only if the target has remained in a constant position, different from its initial position, for more than a set period of time. This can reduce the amount of movement the arm undergoes over an entire surgical procedure, as it restricts the movement of the automated arm to significant, non-incidental target movements. A typical duration for which the target must hold a constant position in port-based brain surgery is 15 to 25 seconds; this period may vary for other surgical procedures, although the methodology remains applicable. A related modality may involve estimating the extent of occlusion of the surgical space due to port misalignment with respect to the line of sight of the video scope 104. This can be estimated using the available tracking information about the port orientation and the video scope orientation. Alternatively, the extent of occlusion of the surgical space can be estimated from the fraction of the distal end of the port that is still visible through the video scope. An exemplary range of acceptable occlusion is 0 to 30%. [0119] The second modality addresses a further problem with constant realignment of the end effector: the target may not be held stationary, and is subject to inadvertent tiny movements that the tracking system will detect. These tiny movements can cause the automated arm to make small realignments in sync with the small movements of the port. These realignments can be significant, because the end effector realigns at a radial distance from the port; a small angular movement of the target at the target location can thus be magnified into a large absolute movement of the automated arm located a radial distance away from the target. A simple way to solve this problem is to have the intelligent positioning system trigger arm movement only if realigning the automated arm would cause the automated arm to move more than a threshold amount, for example, a move longer than five centimeters in any direction.
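A minimal sketch of the two gating modalities above, assuming the 15 to 25 second dwell window and the 5 cm threshold given as examples in the text; the pose interface and the stillness tolerance are hypothetical.

```python
import time
import numpy as np

DWELL_S = 20.0            # within the 15-25 s range given for port-based brain surgery
MOVE_THRESHOLD_M = 0.05   # five-centimeter example threshold
STILL_TOL_M = 0.002       # assumed tolerance for a "constant position"

class RealignmentGate:
    """Trigger realignment only for significant, settled target movements."""
    def __init__(self):
        self.last_pos = None
        self.still_since = None

    def should_realign(self, target_pos: np.ndarray, arm_move_m: float) -> bool:
        now = time.monotonic()
        if self.last_pos is None or np.linalg.norm(target_pos - self.last_pos) > STILL_TOL_M:
            self.last_pos = target_pos      # target still moving: restart the dwell timer
            self.still_since = now
            return False
        settled = (now - self.still_since) >= DWELL_S        # first modality
        significant = arm_move_m > MOVE_THRESHOLD_M          # second modality
        return settled and significant
```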
TEMPLATE MATCHING AND SIFT ALIGNMENT TECHNIQUES [0120] An alternative method of port alignment is to use machine vision techniques to determine the spatial position and pose of the port from the image acquired by the imaging sensor. It should be noted that these techniques (i.e., the template matching and SIFT techniques described below) can be used as inputs to step (810) in the flowchart depicted in Figure 8A and described in detail above, in place of the optical tracking devices described above. [0121] The mentioned methods use a template matching technique or, in an alternative embodiment, a SIFT matching technique to determine the identity, spatial position and pose of the target relative to the end effector mounted on the automated arm. In one embodiment, the template matching technique works by detecting the template located on the target and inferring, from its skewed, rotated, translated and scaled representation in the captured image, its spatial position and pose relative to the imaging sensor. [0122] Figures 10A and 10B are illustrations depicting target features that can be used in optical detection methods. Figures 10A and 10B contain two targets, the first being a surgical pointer tool 1015 and the second a port 100, having attached templates 1025 and 1030, respectively. In an alternative detection method, the SIFT technique functions by using a known size ratio of two or more known features of a target to analyze an image taken by an imaging sensor and detect the target. For example, as shown in Figure 10A, the features could be the inner 1020 and outer 1010 contours of the lip of the port 100. Once a feature is identified, the SIFT technique uses the skewed, rotated, translated and scaled representation of the features in the analyzed image to infer their spatial position and pose relative to the imaging sensor. Both the SIFT matching technique and the template matching technique are described, along with other 3D tracking methods, in the article "Monocular Model-Based 3D Tracking of Rigid Objects: A Survey". It should be noted that other 3D tracking methods can be used to determine the identity, spatial position and pose of a target relative to an imaging sensor by analyzing the imagery obtained by the imaging sensor, as described in section 4 of the aforementioned article. MANUAL/SEMI-MANUAL FLOW [0123] In further implementations of an intelligent positioning system, both manual and automatic alignment of the automated arm can be achieved using the same mechanism, through the use of force-sensing joints on the automated arm that help to identify the direction of movement intended by the user (most likely the surgeon or surgical team). Force sensors built into the joints can sense the intended direction (e.g., a pull or push by the user) and then appropriately energize actuators connected to the joints to assist the movement. This allows the distal end to be moved using powered joint motion guided by the manual indication of the user's intended direction. [0124] In an additional embodiment, the spatial position and pose of the distal end or, equivalently, of the externally mounted device, can be aligned in two steps. The two alignment steps of the present exemplary implementation include 1) gross alignment, which can be performed by the user; 2a) fine positioning, which can be performed by the user assisted by the intelligent positioning system; and/or 2b) fine positioning, which is performed by the intelligent positioning system independently. The smaller range of motion described in steps 2a) and, more evidently, 2b) is optionally limited by a virtual ring or barrier, so that as the system operates to align the distal end, the distal end does not move at a rate that could injure the surgeon, the patient, or anyone assisting in the surgery.
This is accomplished by restricting the movement of the automated arm to within this small ring or barrier. The ring or barrier may represent the extent of the smaller range of motion of the automated arm controlled by the intelligent positioning system. [0125] In additional embodiments, the user can move beyond this range and the system can re-center to a new location through step 1 as described above, provided the larger range of motion of the automated arm controlled by the intelligent positioning system is also automated. [0126] An exemplary alignment procedure is illustrated in the flowchart shown in Figure 9A, within the exemplary context of an external imaging device mounted on the automated arm; a sketch of the two-stage procedure follows this passage. In this case, a user can initially set the gross alignment joints to a neutral position (900) and bring the assembly into close proximity to the patient (910). At this position, the intelligent positioning system computes a target spatial position and end effector pose coordinate based on the zero position (920) that will orient the imaging device coaxially (or at another zero position) with respect to the access port 100 or, for example, the tip of the surgical pointer tools 1005 and 1015 shown in Figure 10B. [0127] In Figure 9A, the kinematics engine issues to the user a set of preferred automated arm joint readings that will achieve the zero position within the tolerance achievable by gross alignment (922). The user can then employ these readings to manually perform the initial alignment step (925). In other embodiments, the user can choose to manually adjust the gross positioning by visual feedback alone, or based on a combination of visual feedback and the preferred joint readings. In yet another embodiment, the user can manually perform the initial alignment guided by feedback coming from the system. For example, the system can provide visual and/or audible information that indicates to the user how close the system's alignment is to a preselected target range or region of alignment in the common coordinate frame. The feedback provided can help the user to identify a suitable rough alignment, for example by directing the user's alignment efforts. [0128] In another embodiment, the user may be able to grasp the end effector and, through a force/torque control loop, guide the end effector into a rough alignment. This control methodology can also be applied if the surgeon wishes to reorient the external imaging device to be non-coaxial with the access port. [0129] Once the rough alignment is completed, the intelligent positioning system can be employed to perform the fine alignment by moving the automated arm so that the imaging device is brought to the exact zero position by any of the algorithms described above and depicted in Figures 8A and 8B. The flowchart shown on the right side of Figure 9A is another exemplary embodiment depicting an automated alignment process that can be performed through the intelligent positioning system, again analogous to the flowchart depicted in Figure 8A. [0130] According to the present embodiments, the alignment of the imaging device is semi-automated; actions are performed with operator intervention, and feedback from the intelligent positioning system is used to provide accurate and/or final alignment of the external device.
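By way of illustration, a sketch of the two-stage flow of Figure 9A, assuming a coarse tolerance within which automated fine alignment takes over; all function names and the tolerance value are hypothetical.

```python
import numpy as np

COARSE_TOL = 0.05  # assumed gross-alignment tolerance (combined pose units)

def two_stage_alignment(ips, arm, ui):
    """Stage 1: the user performs gross alignment guided by suggested joint readings.
    Stage 2: the intelligent positioning system completes fine alignment (Figs. 8A/8B)."""
    desired = ips.compute_zero_position_pose()          # step 920
    suggested = arm.solve_inverse_kinematics(desired)   # step 922
    ui.display_joint_readings(suggested)                # user aligns manually (925)
    while np.linalg.norm(ips.current_pose() - desired) > COARSE_TOL:
        ui.update_feedback(ips.current_pose(), desired) # audible/visual guidance
    ips.run_fine_alignment(desired)                     # automated loop of Figure 8A
```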
[0131] During operator-assisted alignment, the spatial positions and poses of the imaging device are tracked, for example, by any of the aforementioned tracking methods, such as by image analysis as described above, or by tracking the positions of the access port and the imaging sensor with the use of reflective markers, also described above. [0132] The tracked spatial positions and poses are employed to provide feedback to the operator during the semi-automated alignment process. A variety of exemplary embodiments for providing feedback are presented below. It should be understood that these are merely exemplary implementations of the feedback methods and that other methods may be employed without departing from the scope of the present embodiment. Additionally, these and other embodiments can be used in combination or independently. [0133] In an exemplary implementation, haptic feedback can be provided on the automated arm to assist manual positioning of the external device for improved alignment; one example is haptic feedback that provides a tactile click on the automated arm to indicate the optimal alignment position. In another example, haptic feedback can be provided through magnetic or motorized brakes that increase movement resistance when the automated arm is near the desired orientation. [0134] In another embodiment, a small range of movement can be actuated through, for example, magnets or motors, which can drive the spatial positions and poses of the external device to the desired alignment when it is manually positioned to a point close to the ideal position. This allows for general manual positioning with automated fine tuning. [0135] Another exemplary implementation for providing feedback includes providing an audible, tactile or visual signal that changes with distance from the ideal positioning of the access port. For example, two audible signals can be provided that are shifted in time in proportion to the distance from the ideal position. As the imaging sensor is moved toward the ideal position, the signals converge; at the ideal position, the convergence is clearly perceptible. Alternatively, the signal may be periodic in nature, where the frequency of the signal depends on the distance from the desired position. It is observed that human auditory acuity is extremely sensitive and can be used to discriminate very fine alterations. See, for example: http://phys.org/news/2013-02-human-fourier-uncertainty-principle.html. [0136] In another exemplary implementation, visual indicators may be provided that indicate the direction and amount of movement required to move the imaging sensor into alignment. For example, this can be implemented using light sources such as LEDs positioned on the automated arm or, for example, a vector indicator on the camera's video screen display. An exemplary illustration of the vector indicator is shown in Figure 9B, in which the arrows 911, 921 and 931 represent visual indicators to the user performing the manual movement. In this Figure, the shorter arrow 921 represents spatial positions and poses of the imaging device that are closer to the required position, compared to the longer arrow shown at 911.
ZERO POSITIONING [0137] In one embodiment, steps can be taken to define the relative spatial positions and poses of the automated arm (mounted with the external device or, equivalently, an imaging device) relative to the target in the common coordinate frame, e.g., by manually placing the imaging sensor at a chosen spatial position and pose relative to the spatial position and pose of the target, and setting that position in the intelligent positioning system as the (chosen) zero position relative to the port, to which the imaging sensor, and consequently the automated arm, must subsequently return when commanded by the surgeon or automatically by the intelligent positioning system. [0138] An exemplary embodiment for setting the zero position and determining the desired spatial position and pose of the end effector relative to the target is shown in Figure 11. The left flowchart 1100 describes how to set the zero position, as follows. The first step 1110 is to position the end effector relative to the target at the desired spatial position and pose (manually). Once this is completed, the intelligent positioning system moves to the next step 1120, where it acquires the spatial position and pose of the end effector in the common coordinate frame. In the same step it stores this spatial position and pose as coordinates in the common coordinate frame, for example: (xe, ye, ze, αe, βe, γe), [0139] in which the subscript "e" denotes the coordinates of the end effector and the variables α, β and γ represent roll, pitch and yaw, respectively. The next step 1130 is the same as the previous step 1120, except that the process is applied to the target. Exemplary coordinates acquired in this step are: (xt, yt, zt, αt, βt, γt), [0140] where the subscript "t" denotes the target coordinates. The final step 1140 in the flowchart is to subtract the target coordinates from the end effector coordinates to obtain the "zero position" coordinates. The "zero position" coordinates define a transform that, when added to the dynamic target coordinates during surgery, reproduces the relative position of the end effector to the target at the zero position. An example of this calculation is: (xn, yn, zn, αn, βn, γn) = (xe − xt, ye − yt, ze − zt, αe − αt, βe − βt, γe − γt), [0141] where the subscript "n" denotes the "zero position" coordinates. [0142] The rightmost flowchart 1150 in Figure 11 describes an example of how the intelligent positioning system determines the desired position of the end effector during a surgical procedure using the "zero position" coordinates. The first step 1160 is to command the intelligent positioning system to realign the end effector to the zero position. The next step 1170 is to acquire the spatial position and pose of the target in the common coordinate frame. In the same step it stores the spatial position and pose as coordinates, for example: (xt, yt, zt, αt, βt, γt). [0143] The next step 1180 is to add the "zero position" coordinates to the target coordinates to obtain the "desired end effector position" coordinates, for example: (xd, yd, zd, αd, βd, γd) = (xt + xn, yt + yn, zt + zn, αt + αn, βt + βn, γt + γn), [0144] where the subscript "d" denotes the "desired end effector position" coordinates. The final step 1190 is to import these coordinates into the common coordinate frame to define the desired end effector spatial position and pose. A minimal code sketch of this procedure follows.
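The following Python sketch mirrors the Figure 11 flowcharts 1100 and 1150 under the simplifying assumption, implicit in the subtraction above, that poses combine componentwise; a full implementation would compose rigid transforms instead. The function names are hypothetical.

```python
import numpy as np

def set_zero_position(end_effector_pose: np.ndarray, target_pose: np.ndarray) -> np.ndarray:
    """Flowchart 1100, step 1140: zero position = end effector pose minus target pose.
    Poses are (x, y, z, roll, pitch, yaw) in the common coordinate frame."""
    return end_effector_pose - target_pose

def desired_end_effector_pose(target_pose: np.ndarray, zero_position: np.ndarray) -> np.ndarray:
    """Flowchart 1150, step 1180: desired pose = current target pose plus zero position."""
    return target_pose + zero_position

# Usage: record the zero position once, then reapply it as the port moves.
zero = set_zero_position(np.array([0.0, 0.1, 0.5, 0.0, 0.0, 0.0]),
                         np.array([0.0, 0.0, 0.2, 0.0, 0.0, 0.0]))
desired = desired_end_effector_pose(np.array([0.02, 0.0, 0.2, 0.0, 0.1, 0.0]), zero)
```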
MANUAL PORT ALIGNMENT [0145] During an access port procedure, aligning the orientation of the access port for insertion, and ensuring that the access port remains in alignment throughout the cannulation step (as described in greater detail below), can be a crucial part of a successful procedure. Current navigation systems provide a display to facilitate this alignment. Some navigation systems are designed to only ensure alignment to the surgical point of interest regardless of trajectory, while others ensure alignment of a specific trajectory to the surgical point of interest. In either case, this information is displayed on the navigation screen, detached from the surgeon's view of the actual medical instrument. With these systems it is usually necessary to have a second operator focused on the screen, manually calling out distance and orientation information for the surgeon while the surgeon looks at the instrument being handled. [0146] In some embodiments, an alignment device is rigidly and detachably attached to the access port, and may also be employed as an alignment mechanism for use during video-based alignment. [0147] Figure 12B illustrates an exemplary implementation for aligning an access port based on visual feedback in the imaging provided by an external imaging device aligned with the desired trajectory of interest. The conical device 1205 is rigidly and detachably attached to the access port 1230, with its tip 1225 aligned along the geometric axis of the access port and circular annotations 1215 printed at various depths. When the access port is viewed using an external imaging device whose geometric axis is aligned along the desired insertion path, the circular markers 1215 will appear concentric, as shown in Figures 12B(iii) and (iv). A misaligned access port will result in the circular markers not appearing concentric; an example of such misalignment is shown in Figure 12B(ii). In addition, a virtual crosshair 1265 can be displayed on a screen to assist a surgeon in coaxially aligning the access port while viewing the access port through an externally positioned imaging device. The position of the virtual crosshair can be based on preoperative surgical planning and can represent the ideal trajectory for inserting the surgical access port to minimize trauma to the patient. [0148] Figure 12A illustrates another exemplary implementation in which two or more alignment markers 1210 are provided at different depths along the geometric axis of the access port 1230, optionally with a cross at each alignment marker. These alignment markers can be provided with diameters that increase as the distance from the imaging device increases, so that alignment markers are visible even if partially occluded by closer alignment markers. In this embodiment, correct alignment would be indicated by an alignment of all markers within the annotated representation of the markers, as shown in Figures 12A(iii) and (iv). [0149] In an exemplary embodiment, the alignment markers can be provided with a colored border 1240 which, if visible in the imaging device feed, indicates that the alignment is off the geometric axis, as shown in Figure 12A(ii). The video overlay includes a depth-to-target-plane display, so that the insertion distance can be viewed by the surgeon on the same screen as the target overlay and the surgical field video display. A sketch of an image-based concentricity check of this kind is given below.
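As an illustration of the concentricity cue of Figures 12A and 12B, the sketch below checks alignment given the detected centers of two circular markers at different depths; the marker detection itself (e.g., a circle-fitting routine) is assumed to exist upstream, and the pixel tolerance is hypothetical.

```python
import numpy as np

CONCENTRIC_TOL_PX = 4.0  # assumed tolerance for "appearing concentric"

def is_aligned(center_near: np.ndarray, center_far: np.ndarray,
               crosshair: np.ndarray) -> bool:
    """Aligned when the two marker centers coincide with each other and with
    the planned-trajectory crosshair 1265 (all in image pixel coordinates)."""
    concentric = np.linalg.norm(center_near - center_far) < CONCENTRIC_TOL_PX
    on_plan = np.linalg.norm(center_near - crosshair) < CONCENTRIC_TOL_PX
    return bool(concentric and on_plan)

# Usage: marker centers from a detector; crosshair from the preoperative plan.
print(is_aligned(np.array([320.0, 240.0]), np.array([321.5, 241.0]),
                 np.array([322.0, 240.5])))
```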
FUNCTION MODES [0150] In a preferred embodiment, the automated arm of the intelligent positioning system can function in various ways as determined by, but not limited to, the surgeon, the system, the stage of surgery, the image acquisition modality employed, the state of the system, the type of surgery being performed (e.g., port-based, open surgery, etc.), and the safety system. Furthermore, the automated arm can function in a plurality of modes, which may include a follow mode, an instrument tracking mode, a cannulation mode, an optimal viewing mode, an actuation mode, a field of view mode, etc. [0151] The following is a brief summary of some of the modes mentioned above: FOLLOW MODE: [0152] In follow mode, the automated arm follows the target at the predetermined (chosen) spatial position and pose as the target is manipulated by the surgeon (e.g., in the manner illustrated in Figures 16C and 16D and described in detail below), whether electronically or physically. In the case where port-based surgery is used for tumor resection, as mentioned above, the surgeon will manipulate the port within the patient's brain while searching for tumor tissue 120 to operate on. As the port is manipulated, the automated arm mounted with the imaging device moves to consistently provide a constant field of view down the port, with lighting conditions set for tissue differentiation. This mode can be employed with restrictions to ensure that there is no arm contact with any other instrument or personnel, including the surgeon, inside the operating room, by the process described in the description of Figure 6C. This restriction can be achieved with the use of proximity sensors to detect obstacles, or by scene analysis of images acquired of the operating room, as described below in more detail. Furthermore, the surgeon can dictate the chosen spatial position and pose (zero position) of the arm (including the imaging device) relative to the target, or this can be automatically determined by the system itself through image analysis and navigation. [0153] Some alternative modalities derived from the follow mode include an anti-jitter mode, in which imaging sensor vibration is compensated through various methods, such as magnetic lens actuation, stabilization coils, and decreasing the speed of movement of the arm. Jitter can be detected using image analysis software and algorithms available in the industry today; an example of an anti-jitter mechanism is provided in US Patent No. 6,628,711 B1, Method And Apparatus For Compensating For Jitter In A Digital Video Image. In a related delayed-tracking mode, the arm is adjusted to ensure that the spatial position and pose (zero position) of the imaging device are held constant, but the tracking motion is delayed to reduce the likelihood of unwanted motions resulting from minor unintentional target movements (of the port 100, in the case of port-based surgery). INSTRUMENT TRACKING MODE: [0154] In instrument tracking mode, the automated arm can adjust the imaging device to follow the medical instruments used by the surgeon, by centering the focus, the field of view, or any combination thereof on one instrument, on another instrument, or on both instruments. This can be accomplished by identifying each tool and modeling it using specific tracking marker arrangements as described above. CANNULATION MODE: [0155] In cannulation mode, the automated arm adjusts the imaging device to an angle that provides an enhanced view for cannulating the brain using a port. This effectively displays for the surgeon a view of the depth of the port and introducer as it is inserted into the brain.
OPTIMAL VIEWING MODE: [0156] Given the images captured by the imaging device, a mode for optimal viewing can be implemented in which an optimal distance can be obtained and used to actuate the automated arm to a better viewing angle or illumination angle, to provide a maximized field of view, resolution, focus, view stability, etc., as required by the stage of surgery or surgeon preference. The determination of these angles and distances, within the system constraints, is provided by a control system within the intelligent positioning system. The control system can monitor light delivery and focus for the required area of interest, given the optical view (imaging provided by the imaging sensor) of the surgical site; this information can then be used in combination with the intelligent positioning system to determine and adjust the scope to provide the optimal viewing position and spatial pose, depending on the surgeon, the stage of surgery, or the control system itself. ACTUATION MODE: [0157] Additional modes are actuation modes in which, in each case, the surgeon has control of the automated arm actuation to align the imaging device with the target at a chosen spatial position and pose and at a defined distance. In this way, the surgeon can use the target (if it is a physical object) as a pointer to align the imaging device as desired (useful for open surgery) to optimize the surgery being administered. FIELD OF VIEW MODE: [0158] In field of view mode, the automated arm in combination with the imaging device can be made to zoom in on a particular area in the field of view of the image displayed on the surgical monitor. The area can be outlined in the view using instruments that are in the image, or through the use of a cursor controlled by operating room staff or by the surgeon, provided the surgeon has a means to operate the cursor; such devices are disclosed in U.S. patents. COMBINATION OF MODES: [0159] The above-mentioned modes and additional modes can be chosen and performed by the surgeon, by the system, or by any combination thereof; e.g., instrument tracking mode and optimal lighting mode can be activated when the surgeon starts using a particular tool, as noted by the system. In addition, the lighting and tracking properties of the modes can be adjusted and customized for each tool in use, for the stage of surgery, or for any combination thereof. The modes can also be employed individually or in any combination, for example a Raman mode in addition to an optical viewing mode. All of the above modes can optionally be performed with custom safety systems to ensure that failures during the intraoperative procedure are minimized. OPTIMIZATION OF THE VIEW AT THE END OF THE PORT [0160] In the context of an imaging device formed as a camera with a configurable lighting source, supported by the automated arm, alignment to the access port can be important for several reasons, such as the ability to provide uniform signal light delivery and reception. In addition, autofocusing the camera at a known location at the edge of the access port may be required or beneficial.
[0161] In some implementations, the present embodiments can provide accurate alignment, light delivery, regional image enhancement and focus for external imaging devices while maintaining an accurate position. Automated alignment and movement can be performed in coordination with tracking of the target (the port). As noted above, this can be accomplished by determining the spatial position and/or pose of the target (the port) by a tracking method as described above, and employing feedback of the tracked spatial position and/or pose of the external imaging device while controlling the relative position and/or pose of the external imaging device using the automated arm. [0162] In one embodiment, a directional lighting device such as a laser pointer or a collimated light source (or an illumination source associated with an imaging device supported by the automated arm) may be used for projection. OPTICAL PORT OPTIMIZATION [0163] In yet an additional embodiment, a calibration pattern is located at or near the proximal end of the access port. This pattern allows the camera imaging device to automatically focus, align the orientation of the camera's lens mount, and optionally balance lighting as well as color according to stored values and individual settings. An exemplary method used to identify the particular type of port in use is the template matching method described above. The template 1030 shown in Figure 10A can be used to provide the required information about the port dimensions, so that the imaging device can be configured to the corresponding optimal lighting and focus parameters. [0164] Another stage of alignment may involve the camera imaging device focusing on the tissue depth within the access port, which is positioned at a known depth (provided the access port length is known, and given the distance to the proximal end of the port). The location of the distal end of the access port 100 will be in a known position with respect to the imaging sensor 104 of Figure 1 and the tracked access port 100, in absolute terms, with a certain small expected deviation from the tissue surface that arches into the access port at the distal end. With a given field of view, the camera's optical zoom/focus factors, and a known distance from the detector to the edge of the access port, the focus setting can be dynamically predetermined to enable autofocus on the tissue edge based simply on access port tracking and camera location, using a few known definitions (camera, access port length, focus optics/mechanics, desired field of view). In this way, a stable focus can be established to maximize the desired field of view; a minimal sketch of this computation follows.
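A minimal sketch of the tracking-based autofocus of paragraph [0164], assuming the camera and port positions come from the tracking system and that focus is set from a plain distance value; the tissue deflection allowance and the interfaces are hypothetical.

```python
import numpy as np

TISSUE_DEFLECTION_M = 0.003  # assumed small bulge of tissue into the distal end

def focus_distance(camera_pos: np.ndarray, port_proximal_pos: np.ndarray,
                   port_length_m: float) -> float:
    """Distance from the camera to the tissue at the distal end of the tracked port:
    camera-to-proximal-end distance plus the known port length, less the expected
    tissue deflection into the port."""
    d_proximal = float(np.linalg.norm(port_proximal_pos - camera_pos))
    return d_proximal + port_length_m - TISSUE_DEFLECTION_M

# Usage: set the lens focus from tracking alone, without image-based autofocus.
print(focus_distance(np.array([0.0, 0.0, 0.6]), np.array([0.0, 0.0, 0.25]), 0.07))
```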
[0165] In a similar closed-loop manner, the color and white balance of the imaging device output can be determined through appropriate image processing methods. A significant issue with current surgical optics is glare, caused by fluids that reflect the intense illumination in the surgical cavity. Glare unbalances the dynamic range of the camera, saturating the upper end of the detector's dynamic range. Furthermore, the lighting intensity across the frequency spectrum can be unbalanced depending on the lighting and surgical conditions. Using a combination of features or calibration targets at the access port 100, and using predefined parameters associated with the camera and light source combination, images can be automatically optimized for color balance, white balance, dynamic range and lighting uniformity (spatial uniformity). Several published algorithms can be employed to automatically adjust these image characteristics. For example, the algorithm published by Jun-yan Huo et al. ("Robust automatic white balance algorithm using gray color points in images," IEEE Transactions on Consumer Electronics, Volume 52, No. 2, May 2006) can be employed to achieve automatic white balance of the captured video data; a simplified sketch is given below. Additionally, the surgical context can be used to tailor optimal imaging conditions. This will be discussed in more detail below. IMAGE OPTIMIZATION WITH A TWO-STAGE METHOD [0166] Alternatively, in a two-stage approach, the tracking system can be employed, in a first alignment stage, to track the position of the access port, for a coarse calculation of spatial position and pose. This allows an imaging device 104, as seen in Figure 1, to be positioned coaxially with the port 100, and at the appropriate focal length and focal setting for the user-defined field of view, resolution and frame rate. This will only be accurate within the tolerance of the system's tracking capability, the mechanical arm positioning accuracy, and the tissue deflection at the access port tip. [0167] A second alignment stage, based on optimization of the imaging focus, can optionally be achieved by interaction of the imaging sensor, automated arm positioning, image analysis, and the use of range detection to the end of the access port (e.g., by template matching), centered on the distal end of the access port. For example, as is done with the more traditional autofocus functions of digital camera systems, the image can be analyzed to determine image sharpness by means of an image quantification metric in a series of focal zones. The focal zones are targeted to a location at the end of the access port, where the coarse positioning of the system allows this precise, more focused approach to automatically restrict the focal zone to within the field of view of the end of the access port. More specifically, this is defined as a zone smaller than the port's field of view. [0168] In addition, one or more range detectors can be used, optionally through the lens of the imaging device 104, so that the actual position of the tissue at the end of the access port can be calculated. This information can be provided as input to the repeatable algorithm that determines the optimal position, imaging device pose, and focal settings. OPTIMIZED LIGHT DELIVERY AND DETECTION [0169] Coaxial alignment of the imaging sensor with the access port allows effective light delivery to the end of the access port, which is vital for acquiring higher resolution imaging, as well as the ability to focus the optics in order to enhance or maximize detector effectiveness. For example, with a poorly aligned access port and imaging sensor, only a small fraction of the imaging sensor is used to image the area of interest, i.e., the end of the access port. Generally, only 20% of the total detector is used, although a properly aligned imaging sensor can yield 60% or more detector effectiveness. An improvement from 20% to 60% in detector effectiveness yields approximately a 3x improvement in resolution. A setting can be established in the system to define a desired effectiveness at all times. To achieve this, the intelligent positioning system actuates the movement of the automated arm mounted with the imaging sensor and focuses it on the distal end of the access port as the port is manipulated by the surgeon, to achieve the desired detector effectiveness or field of view.
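For illustration, below is a simplified gray-world style white balance in the spirit of the gray-color-points approach cited above. The actual Huo et al. algorithm selects near-gray pixels more carefully before estimating gains; this sketch approximates that selection with a simple chromaticity mask, and all thresholds are assumptions.

```python
import numpy as np

GRAY_TOL = 0.08  # assumed chromaticity distance for "near-gray" pixels

def white_balance(rgb: np.ndarray) -> np.ndarray:
    """Estimate channel gains from near-gray pixels and rescale the image.
    rgb: float array in [0, 1] of shape (H, W, 3)."""
    s = rgb.sum(axis=2, keepdims=True) + 1e-6
    chroma = rgb / s                                    # normalized chromaticity
    gray = np.all(np.abs(chroma - 1.0 / 3.0) < GRAY_TOL, axis=2)
    if not gray.any():                                  # fall back to global means
        gray = np.ones(rgb.shape[:2], dtype=bool)
    means = rgb[gray].mean(axis=0)
    gains = means.mean() / means                        # equalize channel means
    return np.clip(rgb * gains, 0.0, 1.0)
```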
HOMOGENIZED LIGHT DISTRIBUTION [0170] Another advantageous result of this embodiment is the distribution of homogenized light through the port to the surgical area of interest, which allows improved differentiation between healthy and unhealthy brain tissue, potentially reducing glare and reducing shadows cast on the tissue by the port. For example, the intelligent positioning system can use ray-tracing software (such as ZEMAX) to model the system given the constraints of the spatial position, pose and 3D virtual model of the port, as well as the spatial position, pose and model of the lighting source, as shown in Figure 13. The first model 1310 shows illumination of the region of interest using a single lighting element on the external imaging device at a given distance and pose relative to the port. The second 1320 and third 1330 models show illumination of the region of interest using two lighting sources each; the source pairs in each model are oriented differently relative to the other model, and both models two and three have the same pose and distance parameters relative to the port as model one. The final model 1340 shows illumination from two sources with the same orientation relative to the imaging device as the sources in the second model 1320, with the same pose but a different distance. The color map in each region of interest (the distal end of the port) shown in the Figure describes the level of illumination, where the mid-range 1350 represents the optimal level of illumination. [0171] As can be seen in Figure 13, hotspots 1360 exist in models one through three (1310, 1320, 1330), which result in strong glare at these positions and inadequate imaging for the surgeon, while model four 1340 provides the ideal lighting condition (a homogenized, low-glare distribution of illumination); a sketch of such a scoring criterion is given below. Using model four as the ideal spatial position and pose alignment of the lighting source, the automated arm would position the imaging sensor (including the light source) to achieve that particular illumination level map, thereby enhancing the view of the surgical area of interest for the surgeon. The software can then determine the optimal spatial position and pose of the lighting source (the imaging device, in this case) relative to the target (the port), subject to the system constraints (such as the minimum offset 575 shown in Figures 6A and 6B), to ensure optimal light distribution through the port to the region of interest. The light source can also be optimally positioned after modeling the shadows cast by the surgical tools. In other words, the target region within the field of view can be optimally illuminated while avoiding shadows cast by the medical instruments used by the surgeon inside the port. This is possible because the spatial position and pose of the medical instrument can be estimated using the tracking markers placed on the surgical tools.
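The following sketch scores candidate illumination maps like those of Figure 13, preferring homogeneous maps near the optimal level and penalizing hotspots; the optimal-level value, hotspot threshold and penalty weight are hypothetical stand-ins for a calibrated ray-tracing output.

```python
import numpy as np

OPTIMAL_LEVEL = 0.5    # assumed normalized illumination target (mid-range 1350)
HOTSPOT_LEVEL = 0.85   # assumed level above which a pixel counts as a hotspot 1360

def illumination_cost(level_map: np.ndarray) -> float:
    """Lower is better: deviation from the optimal level plus a hotspot penalty."""
    deviation = float(np.mean((level_map - OPTIMAL_LEVEL) ** 2))
    hotspots = float(np.mean(level_map > HOTSPOT_LEVEL))
    return deviation + 10.0 * hotspots  # weight chosen to strongly penalize glare

def best_lighting_pose(candidates):
    """candidates: iterable of (pose, simulated level_map) pairs from ray tracing."""
    return min(candidates, key=lambda c: illumination_cost(c[1]))[0]
```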
[0172] Referring now to Figures 14A and 14B, a block diagram of an example system configuration is shown. The exemplary system includes a processing and control system 1400 and various external components, described below. [0173] As shown in Figure 14A, in one embodiment, the control and processing system 1400 may include one or more processors 1402, a memory 1404, a system bus 1406, one or more input/output interfaces 1408, a communication interface 1410, and a storage device 1412. The control and processing system 1400 interfaces with various external devices and components, including, for example, those associated with access port imaging and tracking, namely, motor(s) 1420, external imaging device(s) 1422, projection and lighting device(s) 1424, and the automated arm 1426. Rendering of the user interface and external input is facilitated by one or more displays 1430 and one or more external input/output devices 1426 (such as a keyboard, mouse, footswitch, microphone and speaker). [0174] The control and processing system 1400 also interfaces with an intelligent positioning system 1440, including a tracking device 113, to track items such as an access port (100 in Figure 1, or 1450 in Figure 14) and one or more devices or instruments 1452. Additional optional components include one or more therapeutic devices 1442 that can be controlled via the control and processing system 1400, and external storage 1444, which can be employed, for example, to store preoperative image data, surgical plans and other information. [0175] It should be understood that the system is not intended to be limited to the components shown in Figure 14A. One or more components of the control and processing system 1400 may be provided as external components that interface with a processing device. In an alternative embodiment, the navigation system 1440 can be directly integrated with the control and processing system 1400. [0176] Embodiments of the disclosure may be implemented via the processor 1402 and/or the memory 1404. For example, the functionality described in this document may be implemented partially through hardware logic in the processor 1402 and partially using instructions stored in the memory 1404, as one or more processing engines. Exemplary processing engines include, but are not limited to, a dynamics and statics modeling engine 1458, a user interface engine 1460, a tracking engine 1462, a controller engine 1464, a computer vision engine 1466, an engine to monitor the environment around the automated arm based on inputs from the sensors 1431, an image registration engine 1468, a robotic planning engine 1470, an inverse kinematics engine 1472, and imaging device controllers 1474. Exemplary processing engines are described in more detail below. [0177] Some embodiments can be implemented using the processor 1402 without additional instructions stored in the memory 1404. Some embodiments can be implemented using instructions stored in the memory 1404 for execution by one or more general-purpose microprocessors. Thus, the disclosure is not limited to a specific hardware and/or software configuration. [0178] While some embodiments can be implemented on fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms, and are capable of being applied regardless of the particular type of machine- or computer-readable medium used to actually effect the distribution. [0179] At least some of the aspects disclosed can be embodied, at least in part, in software. That is, the techniques can be performed on a computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile RAM, cache or a remote storage device.
[0180] A computer-readable storage medium can be used to store software and data which, when executed by a data processing system, cause the system to perform various methods. The executable software and data may be stored in various locations including, for example, ROM, volatile RAM, non-volatile RAM and/or cache. [0181] Portions of this software and/or data may be stored on any one of these storage devices. [0182] It is further noted that, in some embodiments, unlike a typical automated arm that has to account for the unknown weight of material picked up by the distal end, the present automated arm only needs to account for the known weight of the external devices (such as imaging devices) connected to the distal end. Thus, the statics and dynamics of the complete automated arm can be modeled a priori (e.g., through the modeling engine 1458 of Figure 14) and this knowledge can be incorporated into the precise control of the arm during tracking. Additionally, imaging and tracking modalities can be used to provide situational awareness to the automated arm, as described above. This situational awareness can be incorporated during tracking of the access port by the external device or arm-supported device, to prevent accidental collision of the arm with obstacles in its path, such as the surgical team, other equipment in the operating room, and the patient. Situational awareness can also come from proximity sensors optionally mounted on the automated arm and/or distal end, as noted above. [0183] In one embodiment, the system is configured consistently with the block diagram shown in Figure 14B. Figure 14B is an exemplary illustration of an intelligent positioning system used in connection with a navigation system. The descriptions below highlight several exemplary communication paths that can be used by the overall intelligent positioning system (IPS). [0184] User -> Foot pedals -> Arm controller -> Positioning arm [0185] The surgeon has three distinct input pedals to control the IPS (a dispatch sketch follows this list): [0186] 1. Align to Tool: Pressing this footswitch 155, shown in Figure 1, aligns the scope 266 to the target (such as the port 100) that is currently being tracked. Pedal 155 must be held continuously during movement toward the tool position captured at the time the pedal was initially pressed. The user needs to press the pedal again to realign. [0187] 2. Increase Offset: This footswitch increases the offset distance 675 between the selected tool and the scope. The distal end moves at constant speed while the pedal is pressed. The offset distance can be increased until the limits of the automated arm are reached. [0188] 3. Decrease Offset: This pedal decreases the offset distance 675, at a constant speed, between the distal end and the selected tool. This movement ceases once a minimum offset distance is reached (depending on the scope and tool selected). [0189] These pedals are connected to the digital inputs on the automated arm via the intelligent positioning system 250. The automated arm controller sends joint-level commands to the motor drives on the automated arm. [0190] These foot pedals can be upgraded to include optical control as well.
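A minimal sketch of the three-pedal interface above; the pedal identifiers, the step size and the controller interface are hypothetical. Align to Tool is treated as a hold-to-run command, as described in paragraph [0186].

```python
from enum import Enum, auto

OFFSET_STEP_M = 0.001  # assumed per-cycle step for constant-speed offset changes

class Pedal(Enum):
    ALIGN_TO_TOOL = auto()
    INCREASE_OFFSET = auto()
    DECREASE_OFFSET = auto()

def dispatch(pedal: Pedal, held: bool, ips):
    """Route a pedal event to the intelligent positioning system 250."""
    if pedal is Pedal.ALIGN_TO_TOOL:
        if held:
            ips.align_to_tracked_target()  # motion stops when the pedal is released
        else:
            ips.hold_position()
    elif pedal is Pedal.INCREASE_OFFSET and held:
        ips.change_offset(+OFFSET_STEP_M)  # bounded by arm limits
    elif pedal is Pedal.DECREASE_OFFSET and held:
        ips.change_offset(-OFFSET_STEP_M)  # bounded by the minimum offset
```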
[0191] User -> Touch screen -> UI computer -> Arm controller [0192] The user can interface with the robot through a touch screen monitor. These interactions are usually performed before surgery. [0193] 1. Initialize the joints: As the robot arm has only relative encoders, each joint must be moved by up to 20 degrees for the system to determine its absolute position. The UI provides an initialization screen in which the user moves each joint until the encoders are initialized. [0194] 2. Imaging sensor selection: The imaging sensor selection made on the UI computer is sent to the automated arm controller. Different imaging sensors have different masses and different desired relative spatial positions and poses with respect to the target (e.g., the port). [0195] 3. Tracked medical instrument selection: The selection of which target to track (since there are multiple possible targets, e.g., a port or a medical instrument) on the UI computer is sent to the automated arm controller. [0196] 4. Degree-of-freedom selection: The user can select whether the tool will be tracked in 6-, 5- or 3-DoF mode. [0197] 5. Set zero position: Sets a new spatial position and pose of the automated arm (and consequently of the imaging sensor, since it is mounted on the automated arm) relative to a target (e.g., the port). [0198] NDI optical tracker -> UI computer -> Arm controller [0199] The NDI tracking system obtains the spatial position and pose of the distal end (or, equivalently, of the imaging sensor) in its field of view. It sends these data to the UI computer, which shares the tracked target and distal end information with the automated arm controller so that the spatial position and pose can be calculated; a sketch of this data path is given below. It can also use the patient reference and registration to determine a no-access zone. [0200] Situational awareness camera -> UI computer -> Monitor [0201] The situational awareness camera (a specific embodiment of an imaging sensor) provides imaging of the surgical site. This imagery is sent to the UI computer, which transforms it into a video stream that is output to an external monitor. In addition, the UI computer can superimpose warnings, error messages, or other information for the user on the video stream.
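For illustration, a sketch of the tracker-to-controller data path of paragraphs [0198] and [0199], with a hypothetical pose message and a placeholder registration transform; neither the message layout nor the transform format is specified in the disclosure.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackedPose:
    name: str             # e.g., "port", "distal_end", "patient_reference"
    position: np.ndarray  # (x, y, z) in tracker coordinates
    rotation: np.ndarray  # 3x3 rotation matrix in tracker coordinates

def to_common_frame(pose: TrackedPose, registration: np.ndarray) -> TrackedPose:
    """Apply the 4x4 patient registration transform to map tracker coordinates
    into the common coordinate frame shared with the arm controller."""
    p = registration @ np.append(pose.position, 1.0)
    r = registration[:3, :3] @ pose.rotation
    return TrackedPose(pose.name, p[:3], r)

def forward_to_controller(controller, poses, registration):
    """UI computer role: register each tracked pose and share it with the arm."""
    for pose in poses:
        controller.update_pose(to_common_frame(pose, registration))
```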
PHASES OF PORT-BASED SURGERY [0202] An exemplary division of a port-based surgical operation into phases is shown in Figure 15. The arm can be used in a corresponding manner in each of the phases to complement and facilitate the surgeon's process during each step. [0203] • The first step (1510) is the scalp incision and craniotomy. During these procedures, the automated arm 102 (connected to the imaging device 104) can be employed to automatically guide the surgeon to the correct position of the craniotomy relative to the brain within the skull. This is achievable through the use of the navigation system in conjunction with the automated arm. [0204] • Once the incision and craniotomy are completed, surgery enters the next phase (1520), and the automated arm can be used to perform an ultrasound scan of the dura, either automatically through the system or manually by the surgical team. Using this information and inputs from the intelligent positioning system, the automated arm (with an imaging device mounted) can project the sulci onto the dura to allow better targeting of the dural incision and increased situational awareness. After the dural incision, the cannulation process begins. In this subphase, the automated arm can be adjusted to an alternate angle to provide a view of the graduation marks on the port as it is cannulated into the brain, so that the surgeon can see its depth. [0205] • In the next two simultaneous phases (1530 and 1540), the automated arm 102 has the greatest utility, in that it assists in providing clear images of the distal end of the port for gross volume reduction of unhealthy brain tissue. During this step, the surgeon 201 maneuvers the port 100 in the patient's brain 202 through multiple movements (e.g., 1665 in Figure 16C) for tumor resection (120), because the distal end of the port, in most cases, does not provide the access necessary for resection of the entire tumor from a single position. An example of this is shown in Figure 16C as the non-accessible portion 1680 of the tumor. As the port is maneuvered, the automated arm (with the imaging device attached) can follow the port in a coaxial fashion to consistently provide a view of the distal end (e.g., as shown in Figures 6A and 6B) in which the surgeon's tools (e.g., 1612) operate; an exemplary flow for the constant alignment of the automated arm and associated scope is provided in Figure 8B. This saves time for the surgeon and surgical team and simplifies the surgical process by preventing the surgical team from having to constantly readjust the imaging device to view through the port at the correct angle, as required in current surgical systems such as the UniArm Surgical Support System (Mitaka USA Inc.). This also increases the surgeon's accuracy by keeping the surgical site view in the same direction (relative to the brain anatomy or any other reference), so that the surgeon remains directionally oriented relative to the surgical site. Another way in which the automated arm (as part of the intelligent positioning system) increases accuracy is by removing the need for the surgeon to reorient themselves with respect to the space (inside the brain) in which they are working, as would otherwise result from removing their instruments and readjusting an imaging sensor manually mounted on an adjustable arm. In addition, the automated arm can also align the lighting device (attached to the distal end or to the imaging sensor) in orientations that provide optimal illumination of the distal end of the port. At this stage, the automated arm can also perform other alignment sequences required by other imaging modalities, for example the stereoscopic imaging described above for 3D imaging. Automated stereoscopic imaging can readily provide more information to the surgeon, again increasing accuracy during the procedure. The automated arm 102 may also provide other imaging modalities through the use of imaging probes, through automated port insertion or automated external scanning, as required by the surgeon or as determined by the navigation system in combination with the intelligent positioning system. [0206] • After the bulk resection phase, the surgical procedure enters the next two simultaneous fine resection phases (1550 and 1560). In this phase, the surgeon removes the tumor from the margins of healthy tissue, differentiating, using his or her knowledge, between healthy and unhealthy tissue. During fine resection, the automated arm is used in a manner similar to the gross volume reduction phase above. [0207] • The next phase of surgery (1570) may require the automated arm to deliver therapeutic agents to the surgical site to remove any remaining unhealthy tissue from the area and promote optimal recovery. This step can be achieved through the navigation system, in combination with the intelligent positioning system, maneuvering its automated arm down the port to the correct location, where a distal end therapeutic instrument can be used to deliver the therapeutic agents.
In addition, the arm may be provided with the ability to maneuver the port as needed to achieve effective delivery to all sites automatically, based on inputs provided by the navigation system and/or the surgeon. [0208] • The final step (1580) involves removal of the port and wound closure, in addition to the application of materials to aid healing of the surgical area. In this step, the automated arm is used in a manner similar to the gross volume reduction step, in that the automated maneuvering of the arm by the system follows the surgeon's surgical tool to provide the necessary visualization. Once the port is removed, the automated arm is maneuvered in a manner similar to the incision step, so as to provide correct visualization of the surgical area while the wound is sutured. [0209] In another embodiment, the intelligent positioning system can be provided with pre-surgical information to enhance arm function. Examples of such information are a system plan that indicates the types of movements and adjustments required for each stage of surgery, as well as the positioning of operating room instruments and staff during the phases of surgery. This simplifies the surgical process by reducing the amount of manual and custom adjustment dictated by the surgeon throughout the procedure. Other information, such as the individual weights of the imaging sensors, can be entered to ensure smooth arm movement by automatically adjusting the motors used to actuate the arm. SINGULARITIES [0210] The Safety Requirements of the American National Standard for Industrial Robots and Robot Systems (ANSI/RIA R15.06-1999) define a singularity as "a condition caused by the collinear alignment of two or more robot axes resulting in unpredictable robot motion and velocities". It is most common in robot arms that use a "triple roll wrist", that is, a wrist about which the three wrist axes, controlling yaw, pitch, and roll, all pass through a common point. An example of a wrist singularity is when the trajectory along which the robot travels causes the first and third axes of the robot's wrist (i.e., robot axes 4 and 6) to align. The second wrist axis then attempts to rotate 360° instantaneously to maintain the orientation of the end effector. Another common term for this singularity is a "wrist flip". The result of a singularity can be quite dramatic and can have adverse effects on the robot arm, the end effector and the process. Some industrial robot manufacturers have attempted to avoid the situation by slightly altering the robot's trajectory to prevent this condition. Another method is to reduce the speed of the robot along its path, thus reducing the speed required for the wrist to make the transition. ANSI/RIA mandates that robot manufacturers make the user aware of singularities if they occur while the system is manually manipulated. [0211] A second type of singularity in vertically articulated six-axis robots with an in-line wrist occurs when the wrist center lies on a cylinder that is centered about axis 1 and has a radius equal to the distance between axes 1 and 4. This is called a shoulder singularity. Some robot manufacturers also mention alignment singularities, in which axes 1 and 6 become coincident; this is simply a subcase of the shoulder singularity. When the robot passes close to a shoulder singularity, joint 1 rotates very fast. [0212] The third and final type of singularity in vertically articulated six-axis robots with an in-line wrist occurs when the wrist center lies in the same plane as axes 2 and 3. SELF-COLLISION AND SINGULARITY MOVEMENT INTERLOCK [0213] Making the automated arm mobile places another constraint on the intelligent positioning system, which must ensure that the mobile base and the automated arm are not simultaneously in motion at any given time; a sketch of this interlock follows. This is achieved through a self-locking mechanism in the system that applies brakes to the arm if the wheel brakes of the mobile base are not engaged. The reason for this restriction is that arm movement without a static base will result in base movement (basic physics). If the arm is mounted on a vertical lift column, the lift column adds to this constraint definition: the lift column cannot be activated if the mobile base wheels are not braked or if the arm is in motion. Similarly, the arm cannot be moved if the lift column is active. If the mobile base wheel brakes are released, the arm and lift column are both disabled and placed in a braked state. ADDITIONAL MODE RESTRICTIONS [0214] • Image-based parameter: movement is triggered only relative to an image-based parameter, for example, only if the portion of the image occupied by the bottom of the port is at least a certain percentage of the total image, or based on some other relevant parameter. [0215] • Axial alignment: e.g., movement is triggered only if the alignment is non-coaxial by more than a certain number of degrees.
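A minimal sketch of the motion interlock of paragraph [0213]; the state flags and device interfaces are hypothetical, and the rules encode only the mutual exclusions stated in the text (arm versus mobile base versus lift column).

```python
from dataclasses import dataclass

@dataclass
class SystemState:
    wheel_brakes_engaged: bool
    arm_moving: bool
    lift_column_active: bool

def arm_motion_allowed(s: SystemState) -> bool:
    """The arm may move only with the base braked and the lift column idle."""
    return s.wheel_brakes_engaged and not s.lift_column_active

def lift_column_allowed(s: SystemState) -> bool:
    """The lift column may run only with the base braked and the arm stationary."""
    return s.wheel_brakes_engaged and not s.arm_moving

def on_wheel_brakes_released(arm, lift):
    """Releasing the wheel brakes disables and brakes both axes."""
    arm.apply_brakes()
    lift.disable()
```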
[0212] The third and final type of singularity in wrist-partitioned vertically articulated six-axis robots occurs when the wrist center lies in the same plane as axes 2 and 3. An illustrative sketch of detecting proximity to these singularities is provided after the closing statements below. AUTO-COLLISION AND SINGULARITY MOVEMENT INTERLOCK [0213] Making the automated arm movable imposes another constraint on the intelligent positioning system: it must ensure that the movable base and the automated arm are never in motion at the same time. This is achieved through an interlock mechanism in the system that applies the arm brakes whenever the wheel brakes of the mobile base are not engaged. The reason for this restriction is that moving the arm without a static base will cause the base itself to move (by simple reaction forces). If the arm is mounted on a vertical lift column, the lift column extends the constraint: the lift column cannot be activated if the mobile base wheels are not braked or if the arm is in motion. Similarly, the arm cannot be moved while the lift column is active. If the mobile base wheel brakes are released, the arm and lift column are both disabled and placed in a braked state. An illustrative sketch of this interlock logic likewise appears after the closing statements below. ADDITIONAL MODE RESTRICTIONS [0214] Further restrictions may gate arm motion on an image-based parameter, for example moving only when the portion of the image occupied by the bottom of the port reaches at least a certain percentage of the total image, or on some other relevant parameter. [0215] An axial-alignment restriction may also be applied, for example moving only when the scope is non-coaxial with the port by more than some threshold of x degrees. CLOSING STATEMENTS [0216] Consequently, in some embodiments of the present disclosure, systems, devices and methods are described that employ imaging devices, targeting devices, tracking devices, navigation systems, software systems and surgical tools to enable a fully integrated, minimally invasive surgical approach to performing neurological and other procedures, such as treating previously inoperable brain tumors, in addition to the intracranial procedure using the port-based method described above. It should be understood, however, that the application of the embodiments provided in this document is not intended to be limited to neurological procedures and may be extended to other medical procedures where it is desired to access tissue in a minimally invasive manner, without departing from the scope of the present disclosure. Non-limiting examples of other minimally invasive procedures include colon, spinal, orthopedic and open procedures, and all single-port laparoscopic surgeries that require navigation of surgical tools through narrow cavities. The specific embodiments described above have been shown by way of example, and it should be understood that such embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents and alternatives that are within the spirit and scope of this disclosure. [0217] While the teachings described herein are presented in conjunction with various embodiments for illustrative purposes, Applicant's teachings are not intended to be limited to such embodiments. On the contrary, Applicant's teachings described and illustrated herein embrace various alternatives, modifications and equivalents, without departing from the embodiments, the general scope of which is defined in the appended claims.
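To make the singularity discussion of paragraphs [0210] to [0212] concrete, the following is a minimal sketch, in Python, of how a supervisory controller might detect proximity to the wrist and shoulder singularities. The function names, thresholds, and the zero-offset shoulder assumption are all illustrative assumptions, not part of the disclosed system.

```python
import math

# Illustrative thresholds; a real system would tune these to the arm.
WRIST_THRESHOLD_RAD = math.radians(5.0)  # assumption, not from the disclosure
SHOULDER_THRESHOLD_M = 0.05              # assumption, not from the disclosure

def near_wrist_singularity(q5: float) -> bool:
    """Axes 4 and 6 of a triple-roll wrist align when joint 5 approaches
    zero, so a small |q5| signals proximity to a 'wrist flip'."""
    return abs(q5) < WRIST_THRESHOLD_RAD

def near_shoulder_singularity(wrist_center_x: float, wrist_center_y: float) -> bool:
    """For an arm with no offset between axes 1 and 4, the singular cylinder
    degenerates to axis 1 itself (taken here as the base z-axis), so we
    check the horizontal distance of the wrist center from that axis."""
    return math.hypot(wrist_center_x, wrist_center_y) < SHOULDER_THRESHOLD_M
```

When either predicate returns True, a supervisory loop could slow or slightly re-plan the commanded trajectory, mirroring the avoidance strategies noted in paragraph [0210].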
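Similarly, the movement interlock of paragraph [0213] and the coaxiality restriction of paragraph [0215] can be illustrated with a short sketch. The state representation, threshold value, and function names below are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SystemState:
    base_brakes_engaged: bool  # wheel brakes of the mobile base
    arm_in_motion: bool
    lift_column_active: bool
    coaxial_error_deg: float   # scope-to-port angular misalignment

COAXIAL_THRESHOLD_DEG = 5.0    # the "x degrees" of paragraph [0215]; illustrative

def arm_motion_permitted(s: SystemState) -> bool:
    # The arm may move only when the base is braked and the lift column
    # is idle; otherwise the arm is held in a braked state.
    return s.base_brakes_engaged and not s.lift_column_active

def lift_column_motion_permitted(s: SystemState) -> bool:
    # The lift column may be activated only when the base is braked and
    # the arm is stationary.
    return s.base_brakes_engaged and not s.arm_in_motion

def realignment_required(s: SystemState) -> bool:
    # Additional mode restriction: command a realignment move only when
    # the scope is non-coaxial with the port by more than the threshold.
    return s.coaxial_error_deg > COAXIAL_THRESHOLD_DEG
```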
Claims (20) [0001] 1. Automated arm assembly (102) for use during a medical procedure on an anatomical part, the automated arm assembly (102) characterized in that it comprises: a base frame; an articulated arm operably connected to the base frame and having a distal end that is detachably attachable to an end effector; an optical imaging device (104) mounted on the end effector; a weight operably connected to the base frame that provides a counterweight to the articulated arm; a positioning control system operably connected to the articulated arm and also connectable to a surgical navigation system (200) that is configured to provide information relating to a position of a target, said control system configured to receive inputs from said surgical navigation system (200); the positioning control system configured to, based on inputs from the surgical navigation system (200): - identify a position and an orientation of a target in a predetermined coordinate frame with respect to the anatomical part; - obtain a position and an orientation of the optical imaging device (104) mounted on the automated arm (102), which is located outside of and away from the anatomical part and the target, the position and orientation of the optical imaging device (104) being given in the predetermined coordinate frame; - obtain a desired offset distance and a desired orientation between the target and the optical imaging device (104); - instruct the articulated arm to move the optical imaging device (104) to the desired offset distance and desired orientation; - after target movement, determine a new desired offset distance and a new orientation between the optical imaging device (104) and a preselected portion of the target such that the preselected portion of the target is located in the field of view of the optical imaging device (104); and - instruct the articulated arm to move the optical imaging device (104) to the new desired offset distance and new desired orientation. [0002] 2. Automated arm assembly (102) according to claim 1, characterized in that it additionally includes a tower connected to the base frame and extending upwards from it, the articulated arm being connected to the tower and extending out from it. [0003] 3. Automated arm assembly (102) according to claim 2, characterized in that the arm is movable up and down the tower. [0004] 4. Automated arm assembly (102) according to claim 2, characterized in that it additionally comprises a support beam with one end movably connected to the tower and the other end connected to the automated arm (102). [0005] 5. Automated arm assembly (102) according to claim 1, characterized in that the articulated arm has at least six degrees of freedom. [0006] 6. Automated arm assembly (102) according to claim 1, characterized in that the automated arm assembly (102) can be moved manually. [0007] 7. Automated arm assembly (102) according to claim 1, characterized in that the base frame additionally includes wheels. [0008] 8. Automated arm assembly (102) according to claim 1, characterized in that the end effector is tracked using a detection system. [0009] 9. Automated arm assembly (102) according to claim 1, characterized in that the articulated arm additionally includes tracking markers (206, 246) that are tracked using a detection system. [0010] 10.
Automated arm assembly (102) according to claim 1, characterized in that it additionally includes a radial arrangement connected to the distal end of the articulated arm, the end effector being movably connected to the radial arrangement such that the end effector moves along the radial arrangement in response to information from the control system. [0011] 11. Automated arm assembly (102) according to claim 1, characterized in that it additionally includes a control lever operably connected to the control system, the movement of the articulated arm being controllable by the control lever. [0012] 12. Automated arm assembly (102) according to claim 1, characterized in that the end effector is one of an external video scope, an ablation laser, a forceps, an insertable probe or a micromanipulator. [0013] 13. Automated arm assembly (102) according to claim 1, characterized in that the end effector is a first end effector and that it additionally includes a second end effector connectable near the distal end of the articulated arm. [0014] 14. Automated arm assembly (102) according to claim 13, characterized in that the second end effector is a wide-angle camera. [0015] 15. Automated arm assembly (102) according to claim 1, characterized in that the control system restricts the movement of the articulated arm based on defined parameters. [0016] 16. Automated arm assembly (102) according to claim 15, characterized in that the defined parameters include space above the patient, space from the floor, maintaining the surgeon's line of sight, maintaining the line of sight of the tracking camera, arm singularity avoidance, self-collision avoidance, patient collision avoidance, base orientation, and combinations thereof. [0017] 17. Automated arm assembly (102) according to claim 1, characterized in that it additionally includes a protective dome connected to the articulated arm, the distal end of the articulated arm having its movement restricted to within the protective dome. [0018] 18. Automated arm assembly (102) according to claim 1, characterized in that a virtual safety zone is defined by the control system, the distal end of the articulated arm having its movement restricted to within the safety zone. [0019] 19. Automated arm assembly (102) according to claim 1, characterized in that the control system is configured to, upon determining a new desired offset position and a new desired orientation between the optical imaging device (104) and the preselected portion of the target, calculate a desired focus and zoom level for the optical imaging device (104) and adjust the focus and zoom level of the optical imaging device (104) to the desired focus and zoom level when the optical imaging device (104) is moved to the new desired offset position and new orientation. [0020] 20. Automated arm assembly (102) according to claim 19, characterized in that it additionally includes a visual monitor, and in which the control system is configured to show an image of the preselected portion of the target at the desired focus and zoom level of the optical imaging device (104).
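By way of illustration only, the following sketch shows one possible reading of the positioning loop recited in claim 1, assuming simplified poses (a 3D position plus a unit viewing axis) and hypothetical callables for the navigation system and arm; none of these names come from the disclosure.

```python
import numpy as np

STANDOFF_M = 0.30          # desired offset distance; illustrative value
REPOSITION_TOL_M = 0.005   # re-command the arm only beyond this drift

def desired_scope_pose(target_pos: np.ndarray, target_axis: np.ndarray):
    """Pose that holds the scope at the desired standoff along the port
    axis, looking coaxially down the port."""
    position = target_pos + STANDOFF_M * target_axis
    view_axis = -target_axis
    return position, view_axis

def update(get_target_pose, get_scope_pose, move_arm) -> None:
    """One iteration of the loop: re-read the target pose reported by the
    navigation system, recompute the desired offset and orientation, and
    instruct the arm to move only if the scope has drifted from them."""
    target_pos, target_axis = get_target_pose()   # hypothetical callables
    scope_pos, _ = get_scope_pose()
    goal_pos, goal_axis = desired_scope_pose(target_pos, target_axis)
    if np.linalg.norm(goal_pos - scope_pos) > REPOSITION_TOL_M:
        move_arm(goal_pos, goal_axis)
```

Running update each time the navigation system reports target movement keeps the preselected portion of the target in the field of view, as the final two limitations of claim 1 require.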
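Claim 19 additionally derives a focus and zoom level from the new offset; a minimal sketch of one such derivation follows, where the port diameter, sensor field of view, and fill-the-view policy are hypothetical parameters assumed for illustration.

```python
import math

def desired_focus_and_zoom(offset_m: float,
                           port_diameter_m: float = 0.013,
                           sensor_fov_rad: float = math.radians(40.0)):
    """Set the focus to the standoff distance and choose the zoom factor
    that makes the port opening fill the field of view; every value here
    is illustrative rather than taken from the disclosure."""
    focus_m = offset_m
    visible_width_m = 2.0 * offset_m * math.tan(sensor_fov_rad / 2.0)
    zoom = visible_width_m / port_diameter_m
    return focus_m, zoom
```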